Analysis of operational comfort in manual tasks using human force manipulability measure.
Tanaka, Yoshiyuki; Nishikawa, Kazuo; Yamada, Naoki; Tsuji, Toshio
2015-01-01
This paper proposes a scheme for human force manipulability (HFM) based on the use of isometric joint torque properties to simulate the spatial characteristics of human operation forces at an end-point of a limb with feasible magnitudes for a specified limb posture. This is also applied to the evaluation/prediction of operational comfort (OC) when manually operating a human-machine interface. The effectiveness of HFM is investigated through two experiments and computer simulations of humans generating forces by using their upper extremities. Operation force generation with maximum isometric effort can be roughly estimated with an HFM measure computed from information on the maintained arm posture. The layout of a human-machine interface is then discussed based on the results of operational experiments using an electric gear-shifting system originally developed for robotic devices. The results indicate a strong relationship between the spatial characteristics of the HFM and OC levels when shifting, and OC can be predicted using a multiple regression model with HFM measures.
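The HFM measure in this record is defined in the paper from isometric joint-torque properties; the classical force-manipulability construction conveys the core idea of mapping joint-torque limits through the limb Jacobian to feasible end-point forces. The sketch below is only an illustration of that mapping for a hypothetical two-link planar arm; the link lengths, posture, and unit torque bounds are assumptions, not values from the study.

```python
import numpy as np

def planar_2link_jacobian(l1, l2, q1, q2):
    """End-point Jacobian of a two-link planar arm (shoulder angle q1, elbow angle q2)."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def force_ellipsoid_axes(J):
    """Axes of the force ellipsoid traced by end-point forces F = inv(J).T @ tau with ||tau|| <= 1.

    Large feasible forces correspond to small singular values of J: for a unit
    joint-torque budget the achievable force along the i-th axis scales as 1/s_i.
    """
    U, s, _ = np.linalg.svd(J)
    return U, 1.0 / s

# Hypothetical upper-limb posture (assumed values, for illustration only)
J = planar_2link_jacobian(l1=0.30, l2=0.35, q1=np.deg2rad(45.0), q2=np.deg2rad(60.0))
directions, magnitudes = force_ellipsoid_axes(J)
print("force-ellipsoid axis directions (columns):\n", directions)
print("relative feasible force magnitudes:", magnitudes)
```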
Guiraldelli, Michel F.; Eyster, Craig; Wilkerson, Joseph L.; Dresser, Michael E.; Pezza, Roberto J.
2013-01-01
Faithful chromosome segregation during meiosis requires that homologous chromosomes associate and recombine. Chiasmata, the cytological manifestation of recombination, provide the physical link that holds the homologs together as a pair, facilitating their orientation on the spindle at meiosis I. Formation of most crossover (CO) events requires the assistance of a group of proteins collectively known as ZMM. HFM1/Mer3 is in this group of proteins and is required for normal progression of homologous recombination and proper synapsis between homologous chromosomes in a number of model organisms. Our work is the first study in mammals showing the in vivo function of mouse HFM1. Cytological observations suggest that initial steps of recombination are largely normal in a majority of Hfm1−/− spermatocytes. Intermediate and late stages of recombination appear aberrant, as chromosomal localization of MSH4 is altered and formation of MLH1 foci is drastically reduced. In agreement, chiasma formation is reduced, and cells arrest with subsequent apoptosis at diakinesis. Our results indicate that deletion of Hfm1 leads to the elimination of a major fraction, but not all, of COs. Formation of chromosome axial elements and homologous pairing is apparently normal, and Hfm1−/− spermatocytes progress to the end of prophase I without apparent developmental delay or apoptosis. However, synapsis is altered, with components of the central region of the synaptonemal complex frequently failing to extend the full length of the chromosome axes. We propose that initial steps of recombination are sufficient to support homology recognition, pairing, and initial chromosome synapsis and that HFM1 is required to form normal numbers of COs and to complete synapsis. PMID:23555294
Origins of the high flux hohlraum model
NASA Astrophysics Data System (ADS)
Rosen, M. D.; Hinkel, D. E.; Williams, E. A.; Callahan, D. A.; Town, R. P. J.; Scott, H. A.; Kruer, W. L.; Suter, L. J.
2010-11-01
We review how the "high flux model" (HFM) helped clarify the performance of the Autumn 09 National Ignition Campaign (NIC) gas filled/capsule imploding hohlraum energetics campaign. This campaign showed good laser-hohlraum coupling, reasonably high drive, and implosion symmetry control via cross beam transfer. Mysteries that remained included the level and spectrum of the Stimulated Raman light, the tendency towards pancaked implosions, and drive that exceeded (standard model) predictions early in the campaign, and lagged those predictions late in the campaign. The HFM uses detailed configuration accounting (DCA) atomic physics and a generous flux limiter (f=0.2), both of which contribute to predicting a hohlraum plasma that is cooler than that predicted by the standard XSN average-atom, f=0.05 model. This cooler plasma proved to be key in solving all of those mysteries. Despite past successes of the HFM in correctly modeling Omega Laser Au sphere data and NIC empty hohlraum drive, the model lacked some credibility for this energetics campaign, because it predicted too much hohlraum drive. Its credibility was then boosted by a re-evaluation of the initially reported SRS levels.
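The "generous flux limiter (f=0.2)" refers to capping electron heat conduction at a fraction f of the free-streaming flux; a larger f lets more heat conduct out of the hottest regions, consistent with the cooler hohlraum plasma the model predicts. The snippet below is a minimal sketch of the common sharp-cutoff form of such a limiter; it is not the NIC codes' actual treatment, and the plasma parameters and the unlimited Spitzer-Harm flux value are assumed numbers chosen only to show the f=0.05 vs f=0.2 difference.

```python
import numpy as np

k_B = 1.380649e-23   # J/K
m_e = 9.1093837e-31  # kg

def flux_limited(q_spitzer_harm, n_e, T_e, f):
    """Sharp-cutoff flux limiter: cap the conductive heat flux at f times the
    free-streaming flux q_fs = n_e * k_B * T_e * v_te (one common convention)."""
    v_te = np.sqrt(k_B * T_e / m_e)          # electron thermal speed
    q_fs = n_e * k_B * T_e * v_te            # free-streaming heat flux (W/m^2)
    return np.sign(q_spitzer_harm) * np.minimum(np.abs(q_spitzer_harm), f * q_fs)

# Assumed hohlraum-like plasma parameters, for illustration only
n_e, T_e = 1.0e27, 3.5e7          # m^-3, K (~3 keV)
q_sh = 2.0e18                     # W/m^2, an assumed unlimited Spitzer-Harm flux
print("f=0.05:", flux_limited(q_sh, n_e, T_e, 0.05))
print("f=0.20:", flux_limited(q_sh, n_e, T_e, 0.20))
```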
PREFACE: The International Conference on Highly Frustrated Magnetism HFM2008
NASA Astrophysics Data System (ADS)
Eremin, Ilya; Brenig, Wolfram; Kremer, Reinhard; Litterst, Jochen
2009-01-01
The International Conference on Highly Frustrated Magnetism 2008 (HFM2008) took place on 7-12 September 2008 at the Technische Universität Carolo-Wilhelmina zu Braunschweig, Germany. This conference was the fourth event in a series of meetings, which started in Waterloo, Canada (HFM 2000), followed by the second one in Grenoble, France (HFM 2003), and the third meeting in Osaka, Japan (HFM 2006). HFM2008 attracted more than 220 participants from all over the world. The number of participants of the HFM conference series has been increasing steadily, from about 80 participants at HFM 2000, to 120 participants at HFM 2003, and 190 participants at HFM 2006, demonstrating that highly frustrated magnetism remains a rapidly growing area of research in condensed matter physics. At the end of HFM2008 it was decided that the next International Conference on Highly Frustrated Magnetism will be held in Baltimore, USA in 2010. HFM2008 saw four plenary talks by R Moessner, S Nakatsuji, S-W Cheong, and S Sachdev, 18 invited presentations, 30 contributed talks and about 160 poster presentations from all areas of frustrated magnetism. The subjects covered by the conference included: Kagome systems; itinerant frustrated systems; spinels and pyrochlore materials; triangular systems; unconventional order and spin liquids; chain systems; and novel frustrated systems. This volume of Journal of Physics: Conference Series contains the proceedings of HFM2008, with 83 papers that provide a record of the scientific topics covered by the conference. All articles have been refereed by experts in the field. It is our hope that the reader will enjoy and profit from the HFM2008 Proceedings. Ilya Eremin, Proceedings Editor; Wolfram Brenig, Reinhard Kremer, and Jochen Litterst, Co-Editors. International Advisory Board: L Balents (USA), F Becca (Italy), S Bramwell (UK), P Fulde (Germany), B D Gaulin (Canada), J E Greedan (Canada), A Harrison (France), Z Hiroi (Japan), H Kawamura (Japan), A Keren (Israel), C Lacroix (France), C Lhuillier (France), A Loidl (Germany), G Misguich (France), J Richter (Germany), A M Olés (Poland), P Schiffer (USA), R Stern (Estonia), O Tchernyshyov (USA), M R. Valenti (Germany), G Zwicknagl (Germany). International Program Committee: W Brenig (Germany), C Broholm (USA), M Gingras (Canada), K Ueda (Japan), P Mendels (France), F Mila (Switzerland), R Moessner (Germany). [Conference photograph] On behalf of the HFM2008 Organizing Committee, I wish to express my sincere thanks to everyone who supported us in organizing and setting up HFM2008. Especially, I would like to thank the European Science Foundation and the Max-Planck-Institut für Festkörperforschung for the generous financial support, the Technische Universität Braunschweig and the City of Braunschweig for hosting the conference, all colleagues who served on the Advisory and Program Board, the Referees, and last but not least, the staff members from Stuttgart and Braunschweig, Mrs Gisela Siegle, Mrs Regine Noack and Mrs Katharina Schnettler, for their helpful and invaluable assistance. Reinhard Kremer
2011-02-01
NORTH ATLANTIC TREATY ORGANISATION, RESEARCH AND TECHNOLOGY ORGANISATION, AC/323(HFM-129)TP/349, www.rto.nato.int. RTO Technical Report TR-HFM-129: Final Report of Task Group HFM-129. Published February 2011. Distribution and availability on back cover.
Review of Military Mountain Medicine Technology and Research Barriers
2011-09-01
NORTH ATLANTIC TREATY ORGANISATION, RESEARCH AND TECHNOLOGY ORGANISATION, AC/323(HFM-146)TP/387, www.rto.nato.int. RTO Technical Report TR-HFM-146: Review of Military Mountain Medicine Technology and Research Barriers (Point ... montagne et les freins à la recherche). Final Report of Task Group HFM-146. Published September 2011.
Optimizing Operational Physical Fitness (Optimisation de L’Aptitude Physique Operationnelle)
2009-01-01
NORTH ATLANTIC TREATY ORGANISATION, RESEARCH AND TECHNOLOGY ORGANISATION, AC/323(HFM-080)TP/200, www.rto.nato.int. RTO Technical Report TR-HFM-080: Optimizing Operational Physical Fitness (Optimisation de l'aptitude physique opérationnelle). Final Report of Task Group 019.
Peluso, Ilaria; Villano, Debora V; Roberts, Susan A; Cesqui, Eleonora; Raguzzini, Anna; Borges, Gina; Crozier, Alan; Catasta, Giovina; Toti, Elisabetta; Serafini, Mauro
2014-01-01
Postprandial stress induced by acute consumption of meals with a high fat content results in an increase of markers of cardiometabolic risk. Repeated acute dietary stress may induce a persistent low-grade inflammation, playing a role in the pathogenesis of functional gut diseases. This may cause an impairment of the complex immune response of the gastrointestinal mucosa, which results in a breakdown of oral tolerance. We investigated the effect of ingestion of a fruit-juice drink (FJD), composed of multiple fruit juices and extracts, green tea extracts and vitamin C, on postprandial stress induced by a high-fat meal (HFM) in healthy overweight subjects. Following a double-blind, placebo-controlled, cross-over design, 15 healthy overweight subjects were randomized to an HFM providing 1334 kcal (55% fat, 30% carbohydrates and 15% proteins) in combination with 500 mL of a placebo drink (HFM-P) or a fruit-juice drink (HFM-FJD). Ingestion of HFM-P led to an increase in circulating levels of cholesterol, triglycerides, glucose, insulin, TNF-α and IL-6. Ingestion of HFM-FJD significantly reduced plasma levels of cholesterol and triglycerides, decreasing the inflammatory response mediated by TNF-α and IL-6. Ingestion of a fruit-juice drink reduces markers of postprandial stress induced by an HFM.
Hemifacial microsomia: from gestation to childhood.
Werler, Martha M; Starr, Jacqueline R; Cloonan, Yona K; Speltz, Matthew L
2009-03-01
Hemifacial microsomia (HFM) is a variable, complex malformation involving asymmetric hypoplasia of the face and ear. Little is known about the risk factors for or consequences of HFM. Here, we describe 3 studies that have been or are currently being conducted to further our understanding of this malformation. The first completed study examined whether HFM risk is related to maternal exposures that may affect blood flow. In that case-control study, interview data from 230 mothers of children in the case group and 678 mothers of children in the control group suggested that maternal use of vasoactive medications in the first trimester, particularly in combination with cigarette smoking, was associated with increased risks of HFM. The second study is currently underway, in which we are evaluating whether HFM risk is related to genetic variation in pathways associated with vasculogenesis and hemostasis, using DNA collected in the first study. The third, ongoing study follows children with HFM to identify psychosocial, cognitive, dental, and medical sequelae. When the children from the original case-control study are 6 or 7 years of age, mothers and teachers complete self-administered questionnaires that cover a wide range of psychosocial development domains. Preliminary analyses of 115 case and 314 control children suggest that children with HFM may have worse teacher-reported academic performance and possibly higher levels of internalizing behavior problems than control children. When data on the full study sample are available, further analyses will determine whether the preliminary findings persist and if they vary by HFM phenotype, parenting style, or indicators of social risk.
Hydrogeologic framework of the middle San Pedro watershed, southeastern Arizona
Dickinson, Jesse; Kennedy, Jeffrey R.; Pool, D.R.; Cordova, Jeffrey T.; Parker, John T.; Macy, J.P.; Thomas, Blakemore
2010-01-01
Water managers in rural Arizona are under increasing pressure to provide sustainable supplies of water despite rapid population growth and demands for environmental protection. This report describes the results of a study of the hydrogeologic framework of the middle San Pedro watershed. The components of this report include: (1) a description of the geologic setting and depositional history of basin fill sediments that form the primary aquifer system, (2) updated bedrock altitudes underlying basin fill sediments calculated using a subsurface density model of gravity data, (3) delineation of hydrogeologic units in the basin fill using lithologic descriptions in driller's logs and models of airborne electrical resistivity data, (4) a digital three-dimensional (3D) hydrogeologic framework model (HFM) that represents spatial extents and thicknesses of the hydrogeologic units (HGUs), and (5) description of the hydrologic properties of the HGUs. The lithologic interpretations based on geophysical data and unit thickness and extent of the HGUs included in the HFM define potential configurations of hydraulic zones and parameters that can be incorporated in groundwater-flow models. The hydrogeologic framework comprises permeable and impermeable stratigraphic units: (1) bedrock, (2) sedimentary rocks predating basin-and-range deformation, (3) lower basin fill, (4) upper basin fill, and (5) stream alluvium. The bedrock unit includes Proterozoic to Cretaceous crystalline rocks, sedimentary rocks, and limestone that are relatively impermeable and poor aquifers, except for saturated portions of limestone. The pre-basin-and-range sediments underlie the lower basin fill but are relatively impermeable owing to cementation. However, they may be an important water-bearing unit where fractured. Alluvium of the lower basin fill, the main water-bearing unit, was deposited in the structural trough between the uplifted ridges of bedrock and (or) pre-basin-and-range sediments. Alluvium of the upper basin fill may be more permeable than the lower basin fill, but it is generally unsaturated in the study area. The lower basin fill stratigraphic unit was delineated into three HGUs on the basis of lithologic descriptions in driller's logs and one-dimensional (1D) electrical models of airborne transient electromagnetic (TEM) surveys. The interbedded lower basin fill (ILBF) HGU represents an upper sequence having resistivity values between 5 and 40 ohm-m identified as interbedded sand, gravel, and clay in driller's logs. Below this upper sequence, fine-grained lower basin fill (FLBF) HGU represents a thick silt and clay sequence having resistivity values between 5 and 20 ohm-m. Within the coarse-grained lower basin fill (CLBF) HGU, which underlies the silt and clay of the FLBF, the resistivity values on logs and 1D models increase to several hundred ohm-m and are highly variable within sand and gravel layers. These sequences match distinct resistivity and lithologic layers identified by geophysical logs in the adjacent Sierra Vista subwatershed, suggesting that these sequences are laterally continuous within both the Benson and Sierra Vista subwatersheds in the Upper San Pedro Basin. A subsurface density model based on gravity data was constructed to identify the top of bedrock and structures that may affect regional groundwater flow. The subsurface density model contains six layers having uniform density values, which are assigned on the basis of geophysical logs.
The density values for the layers range between 1.65 g/cm3 for unsaturated sediments near the land surface and 2.67 g/cm3 for bedrock. Major features include three subbasins within the study area, the Huachuca City subbasin, the Tombstone subbasin, and the Benson subbasin, which have no expression in surface topography or lithology. Bedrock altitudes from the subsurface density model defined top altitudes of the bedrock HGU. The HFM includes these HGUs in ascending stratigraphic order.
Johnson, Ariel M; Kurti, Stephanie P; Smith, Joshua R; Rosenkranz, Sara K; Harms, Craig A
2016-03-01
A high-fat meal (HFM) induces an increase in blood lipids (postprandial lipemia; PPL), systemic inflammation, and acute airway inflammation. While acute exercise has been shown to have anti-inflammatory and lipid-lowering effects, it is unknown whether exercise prior to an HFM will translate to reduced airway inflammation post-HFM. Our purpose was to determine the effects of an acute bout of exercise on airway inflammation post-HFM and to identify whether any protective effect of exercise on airway inflammation was associated with a reduction in PPL or systemic inflammation. In a randomized cross-over study, 12 healthy, 18- to 29-year-old men (age, 23.0 ± 3.2 years; height, 178.9 ± 5.5 cm; weight, 78.5 ± 11.7 kg) consumed an HFM (1 g fat/1 kg body weight) 12 h following exercise (EX; 60 min at 60% maximal oxygen uptake) or without exercise (CON). Fractional exhaled nitric oxide (FENO; measure of airway inflammation), triglycerides (TG), and inflammatory markers (high-sensitivity C-reactive protein, tumor-necrosis factor-alpha, and interleukin-6) were measured while fasted and at 2 h and 4 h post-HFM. FENO increased over time (2 h: CON, p = 0.001; EX, p = 0.002), but not by condition (p = 0.991). TG significantly increased 2 and 4 h post-HFM (p < 0.001), but did not differ between conditions (p = 0.256). Inflammatory markers did not significantly increase with time or between conditions (p > 0.05). There were no relationships between FENO and TG or systemic inflammatory markers for any time point or condition (p > 0.05). In summary, an acute bout of moderate-intensity exercise performed 12 h prior to an HFM did not change postprandial airway inflammation or lipemia in healthy, 18- to 29-year-old men.
Prior Consumption of a Fat Meal in Healthy Adults Modulates the Brain's Response to Fat.
Eldeghaidy, Sally; Hort, Joanne; Hollowood, Tracey; Singh, Gulzar; Bush, Debbie; Foster, Tim; Taylor, Andy J; Busch, Johanneke; Spiller, Robin C
2016-01-01
Background: The consumption of fat is regulated by reward and homeostatic pathways, but no studies to our knowledge have examined the role of high-fat meal (HFM) intake on subsequent brain activation to oral stimuli. Objective: We evaluated how prior consumption of an HFM or water load (WL) modulates reward, homeostatic, and taste brain responses to the subsequent delivery of oral fat. Methods: A randomized 2-way crossover design spaced 1 wk apart was used to compare the prior consumption of a 250-mL HFM (520 kcal) [rapeseed oil (440 kcal), emulsifier, sucrose, flavor cocktail] or noncaloric WL on brain activation to the delivery of repeated trials of a flavored no-fat control stimulus (CS) or flavored fat stimulus (FS) in 17 healthy adults (11 men) aged 25 ± 2 y and with a body mass index (in kg/m2) of 22.4 ± 0.8. We tested differences in brain activation to the CS and FS and baseline cerebral blood flow (CBF) after the HFM and WL. We also tested correlations between an individual's plasma cholecystokinin (CCK) concentration after the HFM and blood oxygenation level-dependent (BOLD) activation of brain regions. Results: Compared to the WL, consuming the HFM led to decreased anterior insula taste activation in response to both the CS (36.3%; P < 0.05) and FS (26.5%; P < 0.05). The HFM caused reduced amygdala activation (25.1%; P < 0.01) in response to the FS compared to the CS (fat-related satiety). Baseline CBF was significantly reduced in taste (insula: 5.7%; P < 0.01), homeostatic (hypothalamus: 9.2%, P < 0.01; thalamus: 5.1%, P < 0.05), and reward areas (striatum: 9.2%; P < 0.01) after the HFM. An individual's plasma CCK concentration correlated negatively with brain activation in taste and oral somatosensory (ρ = −0.39; P < 0.05) and reward areas (ρ = −0.36; P < 0.05). Conclusions: Our results in healthy adults show that an HFM suppresses BOLD activation in taste and reward areas compared to a WL. This understanding will help inform the reformulation of reduced-fat foods that mimic the brain's response to high-fat counterparts and guide future interventions to reduce obesity. PMID:27655761
Prior Consumption of a Fat Meal in Healthy Adults Modulates the Brain's Response to Fat.
Eldeghaidy, Sally; Marciani, Luca; Hort, Joanne; Hollowood, Tracey; Singh, Gulzar; Bush, Debbie; Foster, Tim; Taylor, Andy J; Busch, Johanneke; Spiller, Robin C; Gowland, Penny A; Francis, Susan T
2016-11-01
The consumption of fat is regulated by reward and homeostatic pathways, but no studies to our knowledge have examined the role of high-fat meal (HFM) intake on subsequent brain activation to oral stimuli. We evaluated how prior consumption of an HFM or water load (WL) modulates reward, homeostatic, and taste brain responses to the subsequent delivery of oral fat. A randomized 2-way crossover design spaced 1 wk apart was used to compare the prior consumption of a 250-mL HFM (520 kcal) [rapeseed oil (440 kcal), emulsifier, sucrose, flavor cocktail] or noncaloric WL on brain activation to the delivery of repeated trials of a flavored no-fat control stimulus (CS) or flavored fat stimulus (FS) in 17 healthy adults (11 men) aged 25 ± 2 y and with a body mass index (in kg/m2) of 22.4 ± 0.8. We tested differences in brain activation to the CS and FS and baseline cerebral blood flow (CBF) after the HFM and WL. We also tested correlations between an individual's plasma cholecystokinin (CCK) concentration after the HFM and blood oxygenation level-dependent (BOLD) activation of brain regions. Compared to the WL, consuming the HFM led to decreased anterior insula taste activation in response to both the CS (36.3%; P < 0.05) and FS (26.5%; P < 0.05). The HFM caused reduced amygdala activation (25.1%; P < 0.01) in response to the FS compared to the CS (fat-related satiety). Baseline CBF was significantly reduced in taste (insula: 5.7%; P < 0.01), homeostatic (hypothalamus: 9.2%, P < 0.01; thalamus: 5.1%, P < 0.05), and reward areas (striatum: 9.2%; P < 0.01) after the HFM. An individual's plasma CCK concentration correlated negatively with brain activation in taste and oral somatosensory (ρ = -0.39; P < 0.05) and reward areas (ρ = -0.36; P < 0.05). Our results in healthy adults show that an HFM suppresses BOLD activation in taste and reward areas compared to a WL. This understanding will help inform the reformulation of reduced-fat foods that mimic the brain's response to high-fat counterparts and guide future interventions to reduce obesity.
NASA Astrophysics Data System (ADS)
Yu, Z. B.; Li, Q.; Chen, X.; Guo, F. Z.; Xie, X. J.; Wu, J. H.
2003-12-01
The purpose of this paper is to investigate the stability of oscillation modes in a thermoacoustic Stirling prime mover, which is a combination of a looped tube and a resonator. Two modes, with oscillation frequencies of 76 and 528 Hz, have been observed, whose stabilities are widely different. The stability of the high-frequency mode (HFM) is strongly affected by the low-frequency mode (LFM). Once the LFM is excited when the HFM is present, the HFM will be gradually slaved and suppressed by the LFM. The details of the transition from HFM to LFM are described. The stability curves of the two modes have been measured. Mean pressure Pm is an important control parameter influencing the mode stability in the tested system.
Nonsurgical Treatment of Hemifacial Microsomia: A Case Report.
Nouri, Mahtab; Farzan, Arash
2015-11-01
Hemifacial microsomia (HFM) is a birth defect involving craniofacial structures derived from the first and second branchial arches. Although it is a relatively uncommon malformation, it is the second most common craniofacial birth defect after cleft lip and palate (CL/P). This is a case report of the successful orthodontic treatment of a patient with mild HFM using a non-surgical orthopedic and orthodontic treatment approach, the aim being to treat HFM with the least invasive modality possible. A 7-year-old boy with mild HFM presented with a convex profile and slight chin deviation. Orthopedic treatment was performed using a hybrid functional appliance and high-pull headgear. Treatment continued with a fixed orthodontic straight-wire appliance to achieve perfect occlusion. Excellent esthetic and functional results were achieved; total treatment duration was about 72 months.
Exercise intensity and the protection from postprandial vascular dysfunction in adolescents.
Bond, B; Gates, P E; Jackman, S R; Corless, L M; Williams, C A; Barker, A R
2015-06-01
Acute exercise transiently improves endothelial function and protects the vasculature from the deleterious effects of a high-fat meal (HFM). We sought to identify whether this response is dependent on exercise intensity in adolescents. Twenty adolescents (10 male, 14.3 ± 0.3 yr) completed three 1-day trials: 1) rest (CON); 2) 8 × 1 min cycling at 90% peak power with 75 s recovery [high-intensity interval exercise (HIIE)]; and 3) cycling at 90% of the gas exchange threshold [moderate-intensity exercise (MIE)] 1 h before consuming an HFM (1.50 g/kg fat). Macrovascular and microvascular endothelial function was assessed before and immediately after exercise and 3 h after the HFM by flow-mediated dilation (FMD) and laser Doppler imaging [peak reactive hyperemia (PRH)]. FMD and PRH increased 1 h after HIIE [P < 0.001, effect size (ES) = 1.20 and P = 0.048, ES = 0.56] but were unchanged after MIE. FMD and PRH were attenuated 3 h after the HFM in CON (P < 0.001, ES = 1.78 and P = 0.02, ES = 0.59). FMD remained greater 3 h after the HFM in HIIE compared with MIE (P < 0.001, ES = 1.47) and CON (P < 0.001, ES = 2.54), and in MIE compared with CON (P < 0.001, ES = 1.40). Compared with CON, PRH was greater 3 h after the HFM in HIIE (P = 0.02, ES = 0.71) and MIE (P = 0.02, ES = 0.84), with no differences between HIIE and MIE (P = 0.72, ES = 0.16). Plasma triacylglycerol concentration and total antioxidant status concentration were not different between trials. We conclude that exercise intensity plays an important role in protecting the vasculature from the deleterious effects of an HFM. Performing HIIE may provide greater vascular benefits than MIE in adolescent groups. Copyright © 2015 the American Physiological Society.
Impact of folate therapy on combined immunodeficiency secondary to hereditary folate malabsorption.
Kishimoto, Kenji; Kobayashi, Ryoji; Sano, Hirozumi; Suzuki, Daisuke; Maruoka, Hayato; Yasuda, Kazue; Chida, Natsuko; Yamada, Masafumi; Kobayashi, Kunihiko
2014-07-01
Hereditary folate malabsorption (HFM) is a rare autosomal recessive disorder. Severe folate deficiency in HFM can result in immunodeficiency. We describe a female infant with HFM who acquired severe Pneumocystis pneumonia. The objective of the present study was to elucidate her immunological phenotype and to examine the time course of immune recovery following parenteral folate therapy. The patient demonstrated a combined immunodeficiency with an impaired T cell proliferation response, pan-hypogammaglobulinemia, and an imbalanced pro-inflammatory cytokine profile. She had a normal white blood cell count, normal lymphocyte subsets, and normal complement levels. Two novel mutations were identified within the SLC46A1 gene, producing a compound heterozygote. We confirmed full recovery of her immunological and neurophysiological status with parenteral folate replacement. The time course of recovery of her immunological profile varied widely, however. HFM should be recognized as a unique form of immunodeficiency. Copyright © 2014 Elsevier Inc. All rights reserved.
Oh, Heung-Il; Ye, Sang-Ho; Johnson, Carl A.; Woolley, Joshua R.; Federspiel, William J.; Wagner, William R.
2011-01-01
Hollow fiber membrane (HFM)-based artificial lungs can require a large blood-contacting membrane surface area to provide adequate gas exchange. However, such a large surface area presents significant challenges to hemocompatibility. One method to improve carbon dioxide (CO2) transfer efficiency might be to immobilize carbonic anhydrase (CA) onto the surface of conventional HFMs. By catalyzing the dehydration of bicarbonate in blood, CA has been shown to facilitate diffusion of CO2 toward the fiber membranes. This study evaluated the impact of surface modifying a commercially available microporous HFM-based artificial lung on fiber blood biocompatibility. A commercial poly(propylene) Celgard HFM surface was coated with a siloxane, grafted with amine groups, and then CA was covalently attached. Results following acute ovine blood contact indicated no significant reduction in platelet deposition or activation with the siloxane coating or the siloxane coating with grafted amines relative to base HFMs. However, HFMs with attached CA showed a significant reduction in both platelet deposition and activation compared with all other fiber types. These findings, along with the improved CO2 transfer observed in CA modified fibers, suggest that its incorporation into HFM design may enable a smaller, more biocompatible HFM-based artificial lung. PMID:20633159
The 8th International Conference on Highly Frustrated Magnetism (HFM 2016)
NASA Astrophysics Data System (ADS)
Gardner, J. S.; Kao, Y. J.
2017-04-01
The 8th International Conference on Highly Frustrated Magnetism 2016 (HFM 2016) took place between the 7th and 11th of September 2016 at the GIS Convention Center at National Taiwan University, Taipei, Taiwan. Over 260 participants from all over the world attended the meeting, making it the largest HFM to date and revealing the impressive growth in the community since the original meeting in Waterloo, Canada, where 80 participants attended. Preceding the meeting, a school was held at the National Synchrotron Radiation Research Center to help those new to the field understand the material they were likely to see at HFM2016. Our thanks to the international speakers who attended this school: John Chalker, Michel Kenzelmann, Philippe Mendels, Luigi Paolasini, Kirrily Rule, Yixi Su, Isao Watanabe, and those from Taiwan: W. T. Chen, Y.-J. Kao, L. J. Chang and C. S. Ku, for their enlightening presentations. The HFM 2016 conference consisted of five plenary talks by H Takagi, B D Gaulin, L Balents, Y Tokura and S T Bramwell, 20 invited and 40 contributed presentations, and about 160 poster presentations from all aspects of theoretical and experimental frustrated magnetism. During the conference period, many stimulating discussions were held both inside and outside the conference room. Excursions to Taipei 101 and the National Palace Museum, as well as several organized dinners and receptions, allowed the participants to initiate collaborations and discuss the hottest issues. The subjects covered in the conference included: · Quantum frustrated magnetism and spin liquids · Novel ordering of geometrically frustrated magnets · Frustration effect on the coupling to lattice, orbital and charge degrees of freedom · Exotic phenomena induced by macroscopic degeneracy · Field effect on frustrated magnetism etc. These proceedings represent a small but valuable contribution to the community. I hope you enjoy reading them. In view of the rapid growth of the field, it was decided at the conference that the next HFM conference will be held in two years, in 2018, on the campus of UC Davis, followed by one in Shanghai around December 2019. Finally, on behalf of the HFM 2016 Organizing Committee, I wish to deeply thank all the people who generously helped us in organizing and running the HFM 2016 conference. These include the numerous funding sources, Committee Members, Program and Session chairs, Chia-Chi Liu, Chao-Jung Kuo, Hanz Peng, Laura Bravo and Stella Su, and of course all the participants, without whom it would have simply been another wet week in Taipei. One might remember HFM2016 for the power outage, but we hope you remember the tastes, sounds and views of Taipei, as well as stimulating conversations and, hopefully, the beginnings of productive partnerships.
NASA Astrophysics Data System (ADS)
Agata, R.; Ichimura, T.; Hori, T.; Hirahara, K.; Hashimoto, C.; Hori, M.
2016-12-01
Estimation of the coseismic/postseismic slip using postseismic deformation observation data is an important topic in the field of geodetic inversion. Estimation methods for this purpose are expected to be improved by introducing numerical simulation tools (e.g. finite element (FE) method) of viscoelastic deformation, in which the computation model is of high fidelity to the available high-resolution crustal data. The authors have proposed a large-scale simulation method using such FE high-fidelity models (HFM), assuming use of a large-scale computation environment such as the K computer in Japan (Ichimura et al. 2016). On the other hand, the values of viscosity in the heterogeneous viscoelastic structure in the high-fidelity model are not trivially determined. In this study, we developed an adjoint-based optimization method incorporating HFM, in which fault slip and asthenosphere viscosity are simultaneously estimated. We carried out numerical experiments using synthetic crustal deformation data. We constructed an HFM in the domain of 2048x1536x850 km, which includes the Tohoku region in northeast Japan, based on Ichimura et al. (2013). We used the model geometry data set of JTOPO30 (2003), Koketsu et al. (2008) and the CAMP standard model (Hashimoto et al. 2004). The geometry of crustal structures in the HFM is at 1 km resolution, resulting in 36 billion degrees-of-freedom. Synthetic crustal deformation data due to prescribed coseismic slip and afterslip at the locations of GEONET and GPS/A observation points and S-net are used. The target inverse analysis is formulated as minimization of the L2 norm of the difference between the FE simulation results and the observation data with respect to viscosity and fault slip, combining the quasi-Newton algorithm with the adjoint method. Use of this combination decreases the necessary number of forward analyses in the optimization calculation. As a result, we are now able to finish the estimation in less than 17 hours using 2560 nodes of the K computer. Thus, the target inverse analysis is completed in a realistic time because of the combination of the fast solver and the adjoint method. In the future, we would like to apply the method to actual data.
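The inversion described here minimizes the L2 norm of the misfit between FE-simulated and observed deformation with respect to fault slip and viscosity, pairing a quasi-Newton optimizer with adjoint-computed gradients so that few forward runs are needed. The toy sketch below mirrors only that outer loop: a made-up linear forward model stands in for the viscoelastic FE simulation, and an analytic gradient stands in for the adjoint solve; every operator and dimension is an assumption chosen for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy linear "forward model" standing in for the viscoelastic FE simulation:
# predicted displacements d = G @ m, where m holds slip/viscosity-like parameters.
G = rng.normal(size=(50, 8))
m_true = rng.normal(size=8)
d_obs = G @ m_true + 0.01 * rng.normal(size=50)   # synthetic observations

def misfit_and_gradient(m):
    """L2 misfit and its gradient. In the real problem the gradient would come
    from a single adjoint solve instead of this analytic expression."""
    r = G @ m - d_obs
    return 0.5 * float(r @ r), G.T @ r

# L-BFGS-B is a quasi-Newton method; jac=True tells SciPy the function returns (value, gradient).
result = minimize(misfit_and_gradient, x0=np.zeros(8), jac=True, method="L-BFGS-B")
print("recovered parameters:", np.round(result.x, 3))
print("true parameters:     ", np.round(m_true, 3))
```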
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... and Development (HFM-40), Center for Biologics Evaluation and Research (CBER), Food and Drug... CONTACT: Paul E. Levine, Jr., Center for Biologics Evaluation and Research (HFM-17), Food and Drug... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2011-N-0599...
Khoo, S M; Porter, J H; Edwards, G A; Charman, W N
1998-12-01
Halofantrine (Hf) is a highly lipophilic antimalarial with poor and erratic absorption. Published data indicates that the oral bioavailability of Hf was increased 3-fold in humans and 12-fold in dogs when administered postprandially; however, the proportional formation of the active desbutyl metabolite (desbutylhalofantrine, Hfm) decreased 2.4-fold in humans and 6.8-fold in dogs (Milton et al., Br. J. Clin. Pharmacol. 1989, 28, 71-77; Humberstone et al., J. Pharm. Sci. 1996, 85, 525-529). The current study was undertaken to confirm the putative involvement of CYP3A4 in the N-dealkylation of Hf to Hfm by administering Hf with and without ketoconazole (KC), a specific CYP3A4 inhibitor, and measuring the resulting plasma concentration profiles of Hf and Hfm. The plasma Hfm/Hf AUC(0-72 h) ratio after fasted oral administration of Hf without KC was 0.56, whereas the ratio after fasted oral administration with KC was less than 0.05. It is likely that both hepatic and prehepatic (enterocyte-based) CYP3A4 contributed to metabolism of Hf to Hfm after oral administration. Interestingly, the low plasma Hfm/Hf AUC ratios observed after fasted administration of Hf with KC were similar to the low values previously observed when Hf was administered postprandially (despite increased Hf absorption). The mechanism(s) by which postprandial administration of Hf led to a decrease in its metabolism are unknown, but based on the current data, could include inhibition of CYP3A4-mediated metabolism by components of the ingested meal. Other possibilities include a lipid-induced postprandial recruitment of intestinal lymphatic transport or avoidance of metabolism during transport through the enterocyte into the portal blood. Further studies are required to determine the relative contributions by which these different processes may decrease the presystemic metabolism of Hf.
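The reported Hfm/Hf AUC(0-72 h) ratios come from integrating the two plasma concentration-time profiles over the sampling window; a minimal sketch of that calculation with the linear trapezoidal rule is below. The sampling times and concentrations are invented placeholders, not data from the study.

```python
import numpy as np

def auc_trapezoid(t, c):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    t, c = np.asarray(t, dtype=float), np.asarray(c, dtype=float)
    return float(np.sum(np.diff(t) * (c[1:] + c[:-1]) / 2.0))

# Hypothetical sampling times (h) and plasma concentrations (ng/mL), for illustration only
t   = [0, 2, 4, 8, 12, 24, 48, 72]
hf  = [0, 120, 210, 180, 150, 90, 40, 15]   # halofantrine (Hf)
hfm = [0,  30,  80, 100,  95, 70, 35, 12]   # desbutylhalofantrine (Hfm)

ratio = auc_trapezoid(t, hfm) / auc_trapezoid(t, hf)
print(f"Hfm/Hf AUC(0-72 h) ratio: {ratio:.2f}")
```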
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-17
... Evaluation and Research (CBER) and suggestions for further development. The public workshop will include... Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville... models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public...
Arazawa, D T; Kimmel, J D; Finn, M C; Federspiel, W J
2015-10-01
The use of extracorporeal carbon dioxide removal (ECCO2R) is well established as a therapy for patients suffering from acute respiratory failure. Development of next generation low blood flow (<500 mL/min) ECCO2R devices necessitates more efficient gas exchange devices. Since over 90% of blood CO2 is transported as bicarbonate (HCO3-), we previously reported development of a carbonic anhydrase (CA) immobilized bioactive hollow fiber membrane (HFM) which significantly accelerates CO2 removal from blood in model gas exchange devices by converting bicarbonate to CO2 directly at the HFM surface. This present study tested the hypothesis that dilute sulfur dioxide (SO2) in oxygen sweep gas could further increase CO2 removal by creating an acidic microenvironment within the diffusional boundary layer adjacent to the HFM surface, facilitating dehydration of bicarbonate to CO2. CA was covalently immobilized onto poly (methyl pentene) (PMP) HFMs through glutaraldehyde activated chitosan spacers, potted in model gas exchange devices (0.0151 m2) and tested for CO2 removal rate with oxygen (O2) sweep gas and a 2.2% SO2 in oxygen sweep gas mixture. Using pure O2 sweep gas, CA-PMP increased CO2 removal by 31% (258 mL/min/m2) compared to PMP (197 mL/min/m2) (P<0.05). Using 2.2% SO2 acidic sweep gas increased PMP CO2 removal by 17% (230 mL/min/m2) compared to pure oxygen sweep gas control (P<0.05); device outlet blood pH was 7.38 units. When employing both CA-PMP and 2.2% SO2 sweep gas, CO2 removal increased by 109% (411 mL/min/m2) (P<0.05); device outlet blood pH was 7.35 units. Dilute acidic sweep gas increases CO2 removal, and when used in combination with bioactive CA-HFMs has a synergistic effect to more than double CO2 removal while maintaining physiologic pH. Through these technologies the next generation of intravascular and paracorporeal respiratory assist devices can remove more CO2 with smaller blood contacting surface areas. A clinical need exists for more efficient respiratory assist devices which utilize low blood flow rates (<500 mL/min) to regulate blood CO2 in patients suffering from acute lung failure. Literature has demonstrated approaches to chemically increase hollow fiber membrane (HFM) CO2 removal efficiency by shifting equilibrium from bicarbonate to gaseous CO2, through either a bioactive carbonic anhydrase enzyme coating or bulk blood acidification with lactic acid. In this study we demonstrate a novel approach to local blood acidification using an acidified sweep gas in combination with a bioactive coating to more than double CO2 removal efficiency of HFM devices. To our knowledge, this is the first report assessing an acidic sweep gas to increase CO2 removal from blood using HFM devices. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, Y.; Zheng, L.; Pau, G. S. H.
2016-12-01
A careful assessment of the risk associated with geologic CO2 storage is critical to the deployment of large-scale storage projects. While numerical modeling is an indispensable tool for risk assessment, there is an increasing need to consider and address uncertainties in the numerical models. However, uncertainty analyses have been significantly hindered by the computational complexity of the model. As a remedy, reduced-order models (ROM), which serve as computationally efficient surrogates for high-fidelity models (HFM), have been employed. The ROM is constructed at the expense of an initial set of HFM simulations, and afterwards can be relied upon to predict the model output values at minimal cost. The ROM presented here is part of the National Risk Assessment Program (NRAP) and is intended to predict changes in groundwater quality in response to hypothetical CO2 and brine leakage. The HFM from which the ROM is derived is a multiphase flow and reactive transport model, with a 3-D heterogeneous flow field and complex chemical reactions including aqueous complexation, mineral dissolution/precipitation, adsorption/desorption via surface complexation and cation exchange. Reduced-order modeling techniques based on polynomial basis expansion, such as polynomial chaos expansion (PCE), are widely used in the literature. However, the accuracy of such ROMs can be affected by the sparse structure of the coefficients of the expansion. Failing to identify vanishing polynomial coefficients introduces unnecessary sampling errors, the accumulation of which deteriorates the accuracy of the ROMs. To address this issue, we treat the PCE as a sparse Bayesian learning (SBL) problem, and the sparsity is obtained by detecting and including only the non-zero PCE coefficients one at a time by iteratively selecting the most contributing coefficients. The computational complexity due to predicting the entire 3-D concentration fields is further mitigated by a dimension-reduction procedure, proper orthogonal decomposition (POD). Our numerical results show that utilizing the sparse structure and POD significantly enhances the accuracy and efficiency of the ROMs, laying the basis for further analyses that necessitate a large number of model simulations.
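As a rough illustration of the two ingredients named here, the sketch below builds a POD basis from a set of "high-fidelity" snapshots via the SVD and fits a low-degree polynomial surrogate for the reduced coefficients, zeroing small terms as a crude stand-in for the sparse Bayesian selection of PCE coefficients. The synthetic response, grid, polynomial degree, and threshold are all assumptions; the NRAP ROM itself is far more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Snapshot matrix: columns are high-fidelity model outputs (e.g., concentration fields)
# for different input parameters; here a made-up smooth response for illustration.
params = rng.uniform(-1, 1, size=(40, 2))                   # 40 "HFM" runs, 2 uncertain inputs
x = np.linspace(0, 1, 200)                                  # spatial grid
snapshots = np.array([np.sin(3 * x * (1 + p[0])) + 0.5 * p[1] * x for p in params]).T

# --- POD: truncated SVD of the snapshot matrix gives a low-dimensional basis ---
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
k = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999) + 1   # retain 99.9% of the energy
basis = U[:, :k]
coeffs = basis.T @ snapshots                                 # reduced coordinates, shape (k, 40)

# --- Surrogate: total-degree-2 polynomial in the inputs for each POD coefficient,
#     with small terms zeroed out as a crude stand-in for sparse selection ---
p1, p2 = params[:, 0], params[:, 1]
design = np.column_stack([np.ones_like(p1), p1, p2, p1 * p2, p1**2, p2**2])
beta, *_ = np.linalg.lstsq(design, coeffs.T, rcond=None)
beta[np.abs(beta) < 1e-3] = 0.0                              # sparsify

def rom_predict(p):
    """Predict the full field for a new input p = (p1, p2) without rerunning the HFM."""
    features = np.array([1.0, p[0], p[1], p[0] * p[1], p[0]**2, p[1]**2])
    return basis @ (beta.T @ features)

print("retained POD modes:", k)
print("ROM output at p=(0.3, -0.2), first 5 grid points:", rom_predict((0.3, -0.2))[:5])
```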
Dentino, K M; Valstar, A; Padwa, B L
2016-06-01
The goal of this study was to describe the clinical characteristics and treatment outcomes of patients with hemifacial microsomia (HFM) and cleft lip/palate (CL/P), and to compare them to a historic cohort of patients with non-syndromic CL/P treated at the same centre. A retrospective review of patients with HFM and CL/P was performed; the main outcome measures assessed were cleft type/side, surgical outcome, midfacial retrusion, and speech. Twenty-six patients (13 male, 13 female; mean age 22.7±14.9, range 1-52 years) with cleft lip with/without cleft palate (CL±P) were identified: three with cleft lip (12%), two with cleft lip and alveolus and an intact secondary palate (8%), and 21 with cleft lip and palate (CLP) (81%; 15 unilateral and six bilateral). Four patients (19%) had a palatal fistula after palatoplasty. Twelve of 22 patients aged >5 years (55%) had midfacial retrusion and two (9%) required a pharyngeal flap for velopharyngeal insufficiency (VPI). Fisher's exact test demonstrated a higher frequency of complete labial clefting (P=0.004), CLP (P=0.009), midfacial retrusion (P=0.0009), and postoperative palatal fistula (P=0.03) in HFM compared to non-syndromic CL±P. There was no difference in VPI prevalence. This study revealed that patients with HFM and CL±P have more severe forms of orofacial clefting than patients with non-syndromic CL±P. Patients with HFM and CL±P have more severe midfacial retrusion and a higher palatal fistula rate compared to patients with non-syndromic CL±P. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Wagenaar, Kim P; Broekhuizen, Berna D L; Dickstein, Kenneth; Jaarsma, Tiny; Hoes, Arno W; Rutten, Frans H
2015-12-01
Electronic health support (e-health) may improve self-care of patients with heart failure (HF). We aim to assess whether an adjusted care pathway with replacement of routine consultations by e-health improves self-care as compared with usual care. In addition, we will determine whether the ESC/HFA (European Society of Cardiology/Heart Failure Association) website heartfailurematters.org (HFM website) improves self-care when added to usual care. Finally, we aim to evaluate the cost-effectiveness of these interventions. A three-arm parallel randomized trial will be conducted. Arm 1 consists of usual care; arm 2 consists of usual care plus the HFM website; and arm 3 is the adjusted care pathway with an interactive platform for disease management (e-Vita platform), with a link to the HFM website, which replaces routine consultations with HF nurses at the outpatient clinic. In total, 414 patients managed in 10 Dutch HF outpatient clinics or in general practice will be included and followed for 12 months. Participants are included if they have had an established diagnosis of HF for at least 3 months. The primary outcome is self-care as measured by the European Heart Failure Self-care Behaviour scale (EHFScB scale). Secondary outcomes are quality of life, cardiovascular- and HF-related mortality, hospitalization, and its duration as captured by hospital and general practitioner registries, use of and user satisfaction with the HFM website, and cost-effectiveness. This study will provide important prospective data on the impact and cost-effectiveness of an interactive platform for disease management and the HFM website. unique identifier: NCT01755988. © 2015 The Authors European Journal of Heart Failure © 2015 European Society of Cardiology.
Advancements in Distributed Learning (ADL) Environment in Support of Transformation
2017-01-01
STO Technical Report TR-HFM-212: Advancements in Distributed Learning (ADL) Environment in Support of Transformation (Progrès en apprentissage distribué (ADL) à l'appui de la transformation). This report documents the findings of Task Group 212. The primary objective of this Task Group was to explore an agile...
Repassivation Investigations on Aluminium: Physical Chemistry of the Passive State
NASA Astrophysics Data System (ADS)
Nagy, Tristan Oliver; Weimerskirch, Morris Jhängi Joseph; Pacher, Ulrich; Kautek, Wolfgang
2016-09-01
We show the temporal change in repassivation mechanism as a time-dependent linear combination of a high-field model of oxide growth (HFM) and the point defect model (PDM). The observed switch in transient repassivation current-decrease under potentiostatic control occurs independently of the active electrode size and effective repassivation time for all applied overpotentials. To this end, in situ depassivation of plasma electrolytically oxidized (PEO) coatings on aluminium was performed with nanosecond laser pulses at 266 nm and the repassivation current transients were recorded as a function of pulse number. A mathematical model combines the well-established theories of oxide-film formation and growth kinetics, giving insight into the nonlinear transient behaviour of micro-defect passivation. According to our findings, the repassivation process can be described as charge consumption via two concurrent channels. While the major current decay at the very beginning of the fast-healing oxide follows a point-defect-type exponential damping, the HFM mechanism gradually takes over as the repassivation evolves. Furthermore, the material seems to retain a memory of former laser treatments via defects built in during depassivation, leading to a higher charge contribution of the PDM mechanism at higher pulse numbers.
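A minimal numerical sketch of the two-channel picture described above is given below: one channel decays exponentially (PDM-like), while the other follows the high-field (Cabrera-Mott) growth law d(d)/dt = A*exp(B*U/d), whose associated current decays much more slowly and therefore eventually dominates. All constants, the initial oxide thickness, and the simple additive combination are assumptions for illustration, not fitted values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed high-field constants, overpotential, and initial (damaged) oxide thickness (arb. units)
A, B, U = 1e-3, 2.0, 1.0
d0 = 0.5

def growth(t, d):
    """High-field (Cabrera-Mott) oxide growth rate; the HFM current is proportional to it."""
    return A * np.exp(B * U / d)

t = np.linspace(0.0, 50.0, 500)
sol = solve_ivp(growth, (t[0], t[-1]), [d0], t_eval=t, rtol=1e-8)

i_hfm = growth(t, sol.y[0])              # HFM channel ~ oxide growth rate (slowly decaying)
i_pdm = 0.5 * np.exp(-t / 3.0)           # PDM-like channel: assumed amplitude and time constant
i_total = i_pdm + i_hfm                  # simple additive two-channel transient

# Early on the exponential (PDM) channel dominates; later the high-field channel takes over.
crossover = t[np.argmax(i_hfm > i_pdm)]
print(f"channel crossover at t ~ {crossover:.1f} (arb. units)")
```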
Burton, Kathryn J; Rosikiewicz, Marta; Pimentel, Grégory; Bütikofer, Ueli; von Ah, Ueli; Voirol, Marie-Jeanne; Croxatto, Antony; Aeby, Sébastien; Drai, Jocelyne; McTernan, Philip G; Greub, Gilbert; Pralong, François P; Vergères, Guy; Vionnet, Nathalie
2017-05-01
Probiotic yogurt and milk supplemented with probiotics have been investigated for their role in 'low-grade' inflammation but evidence for their efficacy is inconclusive. This study explores the impact of probiotic yogurt on metabolic and inflammatory biomarkers, with a parallel study of gut microbiota dynamics. The randomised cross-over study was conducted in fourteen healthy, young men to test probiotic yogurt compared with milk acidified with 2 % d-(+)-glucono-δ-lactone during a 2-week intervention (400 g/d). Fasting assessments, a high-fat meal test (HFM) and microbiota analyses were used to assess the intervention effects. Baseline assessments for the HFM were carried out after a run-in during which normal milk was provided. No significant differences in the inflammatory response to the HFM were observed after probiotic yogurt compared with acidified milk intake; however, both products were associated with significant reductions in the inflammatory response to the HFM compared with the baseline tests (assessed by IL6, TNFα and chemokine ligand 5) (P<0·001). These observations were accompanied by significant changes in microbiota taxa, including decreased abundance of Bilophila wadsworthia after acidified milk (log 2-fold-change (FC)=-1·5, P adj=0·05) and probiotic yogurt intake (FC=-1·3, P adj=0·03), increased abundance of Bifidobacterium species after acidified milk intake (FC=1·4, P adj=0·04) and detection of Lactobacillus delbrueckii spp. bulgaricus (FC=7·0, P adj<0·01) and Streptococcus salivarius spp. thermophilus (FC=6·0, P adj<0·01) after probiotic yogurt intake. Probiotic yogurt and acidified milk similarly reduce postprandial inflammation that is associated with a HFM while inducing distinct changes in the gut microbiota of healthy men. These observations could be relevant for dietary treatments that target 'low-grade' inflammation.
Thrush, A B; Antoun, G; Nikpay, M; Patten, D A; DeVlugt, C; Mauger, J-F; Beauchamp, B L; Lau, P; Reshke, R; Doucet, É; Imbeault, P; Boushel, R; Gibbings, D; Hager, J; Valsesia, A; Slack, R S; Al-Dirbashi, O Y; Dent, R; McPherson, R; Harper, M-E
2018-01-01
Background/Objectives: Inter-individual variability in weight loss during obesity treatment is complex and poorly understood. Here we use whole body and tissue approaches to investigate fuel oxidation characteristics in skeletal muscle fibers, cells and distinct circulating protein biomarkers before and after a high-fat meal (HFM) challenge in those who lost the most (obese diet-sensitive; ODS) vs the least (obese diet-resistant; ODR) amount of weight in a highly controlled weight management program. Subjects/Methods: In 20 weight stable-matched ODS and ODR women who previously completed a standardized clinical weight loss program, we analyzed whole-body energetics and metabolic parameters in vastus lateralis biopsies and plasma samples that were obtained in the fasting state and 6 h after a defined HFM, equivalent to 35% of total daily energy requirements. Results: At baseline (fasting) and post-HFM, muscle fatty acid oxidation and maximal oxidative phosphorylation were significantly greater in ODS vs ODR, as was reactive oxygen species emission. Plasma proteomics of 1130 proteins before and 1, 2, 5 and 6 h after the HFM demonstrated distinct group and interaction differences. Group differences identified S-formyl glutathione hydratase, heat shock 70 kDa protein 1A/B (HSP72), and eukaryotic translation initiation factor 5 (eIF5) to be higher in ODS vs ODR. Group-time differences included aryl hydrocarbon interacting protein (AIP), peptidylprolyl isomerase D (PPID) and tyrosine protein-kinase Fgr, which increased in ODR vs ODS over time. HSP72 levels correlated with muscle oxidation and citrate synthase activity. These proteins circulate in exosomes; exosomes isolated from ODS plasma increased resting, leak and maximal respiration rates in C2C12 myotubes by 58%, 21% and 51%, respectively, vs those isolated from ODR plasma. Conclusions: Findings demonstrate distinct muscle metabolism and plasma proteomics in fasting and post-HFM states in diet-sensitive vs diet-resistant obese women. PMID:29151592
ERIC Educational Resources Information Center
Bojare, Inara; Skrinda, Astrida
2016-01-01
The present study is aimed at creating a holistic fractal model (HFM) of autonomous learning for English acquisition in a blended environment of e-studies in adult non-formal education on the basis of the theories and paradigms of philosophy, psychology and education for sustainable development to promote the development of adult learners'…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-07
... Biologics Evaluation and Research (HFM-49), Food and Drug Administration, 1401 Rockville Pike, suite 200N... Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 51, rm. 6183, Silver... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2009-D-0007...
Mandibular asymmetry and the fourth dimension.
Kaban, Leonard B
2009-03-01
This paper represents more than 30 years of discussion and collaboration with Drs Joseph Murray and John Mulliken in an attempt to understand growth patterns over time (ie, fourth dimension) in patients with hemifacial microsomia (HFM). This is essential for the development of rational treatment protocols for children and adults with jaw asymmetry. Traditionally, HFM was thought of as a unilateral deformity, but it was recognized that 20% to 30% of patients had bilateral abnormalities. However, early descriptions of skeletal correction addressed almost exclusively lengthening of the short (affected) side of the face. Based on longitudinal clinical observations of unoperated HFM patients, we hypothesized that abnormal mandibular growth is the earliest skeletal manifestation and that restricted growth of the mandible plays a pivotal role in progressive distortion of both the ipsilateral and contralateral facial skeleton. This hypothesis explains the progressive nature of the asymmetry in patients with HFM and provides the rationale for surgical lengthening of the mandible in children to prevent end-stage deformity. During the past 30 years, we have learned that this phenomenon of progressive distortion of the adjacent and contralateral facial skeleton occurs with other asymmetric mandibular undergrowth (tumor resection, radiation therapy, or posttraumatic defects) and overgrowth (mandibular condylar hyperplasia) conditions. In this paper, I describe the progression of deformity with time in patients with mandibular asymmetry as a result of undergrowth and overgrowth. Understanding these concepts is critical for the development of rational treatment protocols for adults with end-stage asymmetry and for children to minimize secondary deformity.
Offshore killer whale tracking using multiple hydrophone arrays.
Gassmann, Martin; Henderson, E Elizabeth; Wiggins, Sean M; Roch, Marie A; Hildebrand, John A
2013-11-01
To study delphinid near-surface movements and behavior, two L-shaped hydrophone arrays and one vertical hydrophone line array were deployed at shallow depths (<125 m) from the floating instrument platform R/P FLIP, moored northwest of San Clemente Island in the Southern California Bight. A three-dimensional propagation-model-based passive acoustic tracking method was developed and used to track a group of five offshore killer whales (Orcinus orca) using their emitted clicks. In addition, killer whale pulsed calls and high-frequency modulated (HFM) signals were localized using other standard techniques. Based on these tracks, sound source levels for the killer whales were estimated. The peak-to-peak source levels vary between 170 and 205 dB re 1 μPa @ 1 m for echolocation clicks, between 185 and 193 dB re 1 μPa @ 1 m for HFM calls, and between 146 and 158 dB re 1 μPa @ 1 m for pulsed calls.
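Source-level estimates like those quoted here are obtained by adding the modeled transmission loss back onto the level received at a localized range. The study uses a full three-dimensional propagation model; the sketch below shows only the first-order spherical-spreading-plus-absorption version of that back-propagation, and the received level, range, and absorption coefficient are invented example values.

```python
import numpy as np

def source_level(received_level_db, range_m, alpha_db_per_km=0.0):
    """Back-propagate a received level (dB re 1 uPa) to 1 m from the source,
    assuming spherical spreading plus linear absorption (a first-order model)."""
    transmission_loss = 20.0 * np.log10(range_m) + alpha_db_per_km * range_m / 1000.0
    return received_level_db + transmission_loss

# Hypothetical received click level and localized range (illustrative values only)
rl, r = 130.0, 450.0            # dB re 1 uPa (peak-to-peak), metres
print(f"estimated source level: {source_level(rl, r, alpha_db_per_km=2.0):.1f} dB re 1 uPa @ 1 m")
```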
Puvanendran, Velmurugu; Riesen, Guido; Seim, Rudi Ripman; Hagen, Ørjan; Martínez-Llorens, Silvia; Falk-Petersen, Inger-Britt; Fernandes, Jorge M. O.; Jobling, Malcolm
2018-01-01
Diploid and triploid Atlantic salmon, Salmo salar were fed high-protein, phosphorus-rich diets (56–60% protein; ca 18g phosphorus kg-1 diet) whilst being reared at low temperature from start-feeding until parr-smolt transformation. Performances of salmon fed diets based on fish meal (STD) or a mix of fishmeal and hydrolysed fish proteins (HFM) as the major protein sources were compared in terms of mortality, diet digestibility, growth and skeletal deformities. Separate groups of diploids and triploids were reared in triplicate tanks (initially 3000 fish per tank; tank biomass ca. 620 g) from 0–2745 degree-days post-start feeding (ddPSF). Growth metrics (weight, length, condition factor) were recorded at ca. 4 week intervals, external signs of deformities to the operculum, jaws and spinal column were examined in parr sampled at 1390 ddPSF, and external signs of deformity and vertebral anomalies (by radiography) were examined in fish sampled at the end of the trial (2745 ddPSF). The triploid salmon generally had a lower mass per unit length, i.e. lower condition factor, throughout the trial, but this did not seem to reflect any consistent dietary or ploidy effects on either dietary digestibility or the growth of the fish. By the end of the trial fish in all treatment groups had achieved a weight of 50+ g, and had completed the parr-smolt transformation. The triploids had slightly, but significantly, fewer vertebrae (Triploids STD 58.74 ± 0.10; HFM 58.68 ± 0.05) than the diploids (Diploids STD 58.97 ± 0.14; HFM 58.89 ± 0.01), and the incidence of skeletal (vertebral) abnormalities was higher in triploids (Triploids STD 31 ± 0.90%; HFM 15 ± 1.44%) than in diploids (Diploids STD 4 ± 0.80%; HFM 4 ± 0.83%). The HFM diet gave a significant reduction in the numbers of triploid salmon with vertebral anomalies in comparison with the triploids fed the STD diet possibly as a result of differences in phosphorus bioavailability between the two diets. Overall, the incidence of skeletal deformities was lower than reported in previous studies (Diploids 20+%, Triploids 40+%), possibly as a result of the combination of rearing at low-temperature and phosphorus-rich diets being used in the present study. PMID:29566030
Ultra Wideband Wireless Body Area Network for Medical Applications
2010-04-01
gastrointestinal tract. They originally were devised to transmit still images of the digestive tract for subsequent diagnosis and detection of gastrointestinal...considered nondispersive and the skin layer is omitted. As depicted in Figure 6, the model is a semicylinder centred at the origin with radius br. All...Medical Applications, RTO-MP-HFM-182. Figure 8: A Simple Hemispherical Breast Model (tumour, fat, skin and chest layers).
Cryogenic Design of the New High Field Magnet Test Facility at CERN
NASA Astrophysics Data System (ADS)
Benda, V.; Pirotte, O.; De Rijk, G.; Bajko, M.; Craen, A. Vande; Perret, Ph.; Hanzelka, P.
In the framework of the R&D program related to the Large Hadron Collider (LHC) upgrades, a new High Field Magnet (HFM) vertical test bench is required. This facility, located in the SM18 cryogenic test hall, shall allow testing of superconducting magnets weighing up to 15 tons, with stored energies up to 10 MJ, in a temperature range between 1.9 K and 4.5 K. The article describes the cryogenic architecture to be inserted into the general infrastructure of SM18, including the process and instrumentation diagram; the different operating phases, including the strategy for magnet cool-down and warm-up at controlled speed and for quench management; and the design of the main components.
2012-11-01
Bldg. 33 Wright-Patterson AFB, OH 45433 Email: janet.sutton@wpafb.af.mil RTO-TR-HFM-138 Adaptability in...711 Human Performance Wing/Human Effectiveness, Cognitive Systems Branch Wright-Patterson AFB, OH 45433 USA Tel: 1+ 937.656.4316 Fax: 1...AFRL) 711 Human Performance Wing/Human Effectiveness, Cognitive Systems Branch Wright-Patterson AFB, OH 45433 USA Tel: 1+ 937.785.3165 Fax: 1
Resnick, Cory M; Genuth, Joshua; Calabrese, Carly E; Taghinia, Amir; Labow, Brian I; Padwa, Bonnie L
2018-05-28
Patients with hemifacial microsomia (HFM) and Kaban-Pruzansky type III mandibular deformities require ramus construction with autologous tissue. The free fibula flap, an alternative to the costochondral graft, has favorable characteristics for this construction but may be associated with temporomandibular joint ankylosis. The purposes of this study were to present a series of patients with HFM who underwent free fibula flap ramus construction, to determine the incidence of ankylosis, and to identify perioperative factors associated with ankylosis. We performed a retrospective cohort study of patients with HFM who underwent ramus construction with a free fibula flap at Boston Children's Hospital from 2003 to 2015. Patients who had at least 1 year of follow-up and complete medical records were included. The predictor variables included demographic information, HFM severity, surgical history, and operative details. The primary outcome variable was the occurrence of ankylosis. Descriptive statistics were calculated, and significance was set at P < .05. We included 8 patients (75% of whom were female patients) in the study sample. Patients underwent construction at a mean age of 11.4 ± 5.9 years (range, 5 to 21 years). In 5 patients (63%), ankylosis developed during the follow-up period of 7.3 ± 4.8 years. The average time from construction to ankylosis was 4.2 ± 3.7 years. The only predictor variable statistically significantly associated with ankylosis was the use of a contralateral releasing osteotomy, which reduced the rate of ankylosis (P = .035). There was a trend toward a younger age in patients in whom ankylosis developed (8.8 ± 2.6 years) compared with those without ankylosis (15.5 ± 8.1 years, P = .392). The free fibula flap can be associated with a high rate of ankylosis when used for ramus construction in patients with HFM. Passive flap insertion and/or use of a contralateral releasing osteotomy may reduce this risk. Copyright © 2018. Published by Elsevier Inc.
Lopes Krüger, Renata; Costa Teixeira, Bruno; Boufleur Farinha, Juliano; Cauduro Oliveira Macedo, Rodrigo; Pinto Boeno, Francesco; Rech, Anderson; Lopez, Pedro; Silveira Pinto, Ronei; Reischak-Oliveira, Alvaro
2016-12-01
The aim of this study was to compare the effects of 2 different exercise intensities on postprandial lipemia, oxidative stress markers, and endothelial function after a high-fat meal (HFM). Eleven young men completed 2-day trials in 3 conditions: rest, moderate-intensity exercise (MI-Exercise) and heavy-intensity exercise (HI-Exercise). Subjects performed an exercise bout or no exercise (Rest) on the evening of day 1. On the morning of day 2, an HFM was provided. Blood was sampled at fasting (0 h) and every hour from 1 to 5 h during the postprandial period for triacylglycerol (TAG), thiobarbituric acid reactive substances (TBARS), and nitrite/nitrate (NOx) concentrations. Flow-mediated dilatation (FMD) was also analyzed. TAG concentrations were reduced in exercise conditions compared with Rest during the postprandial period (P < 0.004). The TAG incremental area under the curve (iAUC) was smaller after HI-Exercise compared with Rest (P = 0.012). TBARS concentrations were reduced in MI-Exercise compared with Rest (P < 0.041). FMD was higher in exercise conditions than Rest at 0 h (P < 0.02) and NOx concentrations were enhanced in MI-Exercise compared with Rest at 0 h (P < 0.01). These results suggest that acute exercise can reduce lipemia after an HFM. However, HI-Exercise was more effective in reducing the TAG iAUC, which might suggest greater protection against postprandial TAG elevation. Conversely, MI-Exercise may be beneficial in attenuating the susceptibility to oxidative damage induced by an HFM and in increasing endothelial function in the fasted state compared with Rest.
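For readers unfamiliar with the iAUC statistic used above: the total and incremental areas under the postprandial TAG curve are trapezoidal integrals of the hourly samples, with the incremental version first subtracting the fasting (0 h) baseline. The sketch below uses made-up concentrations, not the study's data, and follows the common convention of truncating negative increments at zero.

    import numpy as np

    # Sampling times (h) and hypothetical postprandial TAG concentrations (mmol/L).
    t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    tag = np.array([1.0, 1.4, 1.9, 2.2, 1.8, 1.3])

    def trapezoid_auc(y, x):
        # Area under the curve by the trapezoidal rule.
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    total_auc = trapezoid_auc(tag, t)
    incremental = np.clip(tag - tag[0], 0.0, None)   # rise above the fasting baseline
    iauc = trapezoid_auc(incremental, t)

    print(f"AUC(0-5 h) = {total_auc:.2f} mmol/L*h, iAUC = {iauc:.2f} mmol/L*h")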
2016-12-01
collaborative effort is addressed by six Technical Panels who manage a wide range of scientific research activities, a Group specialising in modelling and...HFM Human Factors and Medicine Panel • IST Information Systems Technology Panel • NMSG NATO Modelling and Simulation Group • SAS System Analysis...and Studies Panel • SCI Systems Concepts and Integration Panel • SET Sensors and Electronics Technology Panel These Panels and Group are the
Jansen, J.; De Napoli, I. E; Fedecostante, M.; Schophuizen, C. M. S.; Chevtchik, N. V.; Wilmer, M. J.; van Asbeck, A. H.; Croes, H. J.; Pertijs, J. C.; Wetzels, J. F. M.; Hilbrands, L. B.; van den Heuvel, L. P.; Hoenderop, J. G.; Stamatialis, D.; Masereeuw, R.
2015-01-01
The bioartificial kidney (BAK) aims at improving dialysis by developing ‘living membranes’ for cells-aided removal of uremic metabolites. Here, unique human conditionally immortalized proximal tubule epithelial cell (ciPTEC) monolayers were cultured on biofunctionalized MicroPES (polyethersulfone) hollow fiber membranes (HFM) and functionally tested using microfluidics. Tight monolayer formation was demonstrated by abundant zonula occludens-1 (ZO-1) protein expression along the tight junctions of matured ciPTEC on HFM. A clear barrier function of the monolayer was confirmed by limited diffusion of FITC-inulin. The activity of the organic cation transporter 2 (OCT2) in ciPTEC was evaluated in real-time using a perfusion system by confocal microscopy using 4-(4-(dimethylamino)styryl)-N-methylpyridinium iodide (ASP+) as a fluorescent substrate. Initial ASP+ uptake was inhibited by a cationic uremic metabolites mixture and by the histamine H2-receptor antagonist, cimetidine. In conclusion, a ‘living membrane’ of renal epithelial cells on MicroPES HFM with demonstrated active organic cation transport was successfully established as a first step in BAK engineering. PMID:26567716
Ye, Sang-Ho; Arazawa, David T.; Zhu, Yang; Shankarraman, Venkat; Malkin, Alexander D.; Kimmel, Jeremy D.; Gamble, Lara J.; Ishihara, Kazuhiko; Federspiel, William J.; Wagner, William R.
2015-01-01
Respiratory assist devices seek optimized performance in terms of gas transfer efficiency and thromboresistance to minimize device size and reduce complications associated with inadequate blood biocompatibility. The exchange of gas with blood occurs at the surface of the hollow fiber membranes (HFMs) used in these devices. In this study, three zwitterionic macromolecules were attached to HFM surfaces to putatively improve thromboresistance: (1) carboxyl-functionalized zwitterionic phosphorylcholine (PC) and (2) sulfobetaine (SB) macromolecules (mPC or mSB-COOH) prepared by a simple thiol-ene radical polymerization and (3) a low-molecular weight sulfobetaine (SB)-co-methacrylic acid (MA) block copolymer (SBMAb-COOH) prepared by reversible addition–fragmentation chain transfer (RAFT) polymerization. Each macromolecule type was covalently immobilized on an aminated commercial HFM (Celg-A) by a condensation reaction, and HFM surface composition changes were analyzed by X-ray photoelectron spectroscopy. Thrombotic deposition on the HFMs was investigated after contact with ovine blood in vitro. The removal of CO2 by the HFMs was also evaluated using a model respiratory assistance device. The HFMs conjugated with zwitterionic macromolecules (Celg-mPC, Celg-mSB, and Celg-SBMAb) showed expected increases in phosphorus or sulfur surface content. Celg-mPC and Celg-SBMAb experienced rates of platelet deposition significantly lower than those of unmodified (Celg-A, >95% reduction) and heparin-coated (>88% reduction) control HFMs. Smaller reductions were seen with Celg-mSB. The CO2 removal rate for Celg-SBMAb HFMs remained comparable to that of Celg-A. In contrast, the rate of removal of CO2 for heparin-coated HFMs was significantly reduced. The results demonstrate a promising approach to modifying HFMs using zwitterionic macromolecules for artificial lung devices with improved thromboresistance without degradation of gas transfer. PMID:25669307
Watson, Dennis P; Young, Jeani; Ahonen, Emily; Xu, Huiping; Henderson, Macey; Shuman, Valery; Tolliver, Randi
2014-10-17
There is currently a lack of scientifically designed and tested implementation strategies. Such strategies are particularly important for highly complex interventions that require coordination between multiple parts to be successful. This paper presents a protocol for the development and testing of an implementation strategy for a complex intervention known as the Housing First model (HFM). Housing First is an evidence-based practice for chronically homeless individuals demonstrated to significantly improve a number of outcomes. Drawing on practices demonstrated to be useful in implementation and e-learning theory, our team is currently adapting a face-to-face implementation strategy so that it can be delivered over a distance. Research activities will be divided between Chicago and Central Indiana, two areas with significantly different barriers to HFM implementation. Ten housing providers (five from Chicago and five from Indiana) will be recruited to conduct an alpha test of each of four e-learning modules as they are developed. Providers will be requested to keep a detailed log of their experience completing the modules and participate in one of two focus groups. After refining the modules based on alpha test results, we will test the strategy among a sample of four housing organizations (two from Chicago and two from Indiana). We will collect and analyze both qualitative and quantitative data from administration and staff. Measures of interest include causal factors affecting implementation, training outcomes, and implementation outcomes. This project is an important first step in the development of an evidence-based implementation strategy to increase scalability and impact of the HFM. The project also has strong potential to increase limited scientific knowledge regarding implementation strategies in general.
2017-12-01
settings – be it field trials, field experiments, tests or evaluations. This guide is based on the experience of the NATO STO Task Group (HFM-211...practice by the Task Group. Feedback from defence colleges and Training and Evaluation staffs is highly welcomed. STO-TR-HFM-211 Une... evaluation programs have been set up to analyse and demonstrate the added value and effectiveness of new developments such as new operational concepts, new
Wagenaar, Kim P; Rutten, Frans H; Klompstra, Leonie; Bhana, Yusuf; Sieverink, Floor; Ruschitzka, Frank; Seferovic, Petar M; Lainscak, Mitja; Piepoli, Massimo F; Broekhuizen, Berna D L; Strömberg, Anna; Jaarsma, Tiny; Hoes, Arno W; Dickstein, Kenneth
2017-11-01
In 2007, the Heart Failure Association of the European Society of Cardiology (ESC) launched the information website heartfailurematters.org (HFM site) with the aim of creating a practical tool through which to provide advice and guidelines for living with heart failure to patients, their carers, health care professionals and the general public worldwide. The website is managed by the ESC at the European Heart House and is currently available in nine languages. The aim of this study is to describe the background, objectives, use, lessons learned and future directions of the HFM site. Data on the number of visitor sessions on the site as measured by Google Analytics were used to explore use of the HFM site from 2010 to 2015. Worldwide, the annual number of sessions increased from 416 345 in 2010 to 1 636 368 in 2015. Most users (72-75%) found the site by using a search engine. Desktops and, more recently, smartphones were used to visit the website, accounting for 50% and 38%, respectively, of visits to the site in 2015. Although its use has increased, the HFM site has not yet reached its full potential: fewer than 2 million users have visited the website, whereas the number of people living with heart failure worldwide is estimated to be 23 million. Uptake and use could be further improved by a continuous process of qualitative assessment of users' preferences, and the provision of professional helpdesk facilities, comprehensive information technology, and promotional support. © 2017 The Authors. European Journal of Heart Failure © 2017 European Society of Cardiology.
Yin, Hong-Yu; Wang, Chuan; Zhang, Zhi-Yong; Shi, Lei; Yin, Lin; Liu, Wei; Feng, Shi; Cao, Yi-Lin; Tang, Xiao-Jun
2018-06-11
Relapse of hemifacial microsomia is thought to be strongly related to the soft tissue envelope around the mandibular angle, composed mainly of the masseter and medial pterygoid muscles. For this reason, we applied masseter injections of type A botulinum toxin to reduce soft tissue envelope tension in the early stage after mandibular distraction in adult HFM patients. Eight patients diagnosed with HFM were studied and randomly assigned to an experimental or control group. Patients in the experimental group were treated with DO, orthognathic surgeries, autologous fat grafting, and bilateral masseter muscle injections of type A botulinum toxin. Patients in the control group were treated with the same procedures except for the masseter muscle injection of type A botulinum toxin. The recurrence rates of both groups were evaluated and analyzed after nearly 1 year of follow-up. The mean recurrence rate was 26.30% ± 11.84% (range 7.62%-37.27%) in the 8 patients after 1-year follow-up. The relapse rate was 16.32% ± 7.78% (7.62%-26.22%) in the experimental group and 36.28% ± 1.03% (34.84%-37.27%) in the control group, a significant difference (P = 0.002). The combination of DO, orthognathic surgeries, autologous fat particle transplantation, and masseter muscle injection of type A botulinum toxin could serve as a comprehensive treatment plan for adult patients with HFM. Furthermore, masseter injection of type A botulinum toxin might be an alternative method to reduce the early recurrence rate in adult HFM patients after surgery.
Belcher, Wayne R.; Sweetkind, Donald S.
2010-01-01
A numerical three-dimensional (3D) transient groundwater flow model of the Death Valley region was developed by the U.S. Geological Survey for the U.S. Department of Energy programs at the Nevada Test Site and at Yucca Mountain, Nevada. Decades of study of aspects of the groundwater flow system and previous less extensive groundwater flow models were incorporated and reevaluated together with new data to provide greater detail for the complex, digital model. A 3D digital hydrogeologic framework model (HFM) was developed from digital elevation models, geologic maps, borehole information, geologic and hydrogeologic cross sections, and other 3D models to represent the geometry of the hydrogeologic units (HGUs). Structural features, such as faults and fractures, that affect groundwater flow also were added. The HFM represents Precambrian and Paleozoic crystalline and sedimentary rocks, Mesozoic sedimentary rocks, Mesozoic to Cenozoic intrusive rocks, Cenozoic volcanic tuffs and lavas, and late Cenozoic sedimentary deposits of the Death Valley regional groundwater flow system (DVRFS) region in 27 HGUs. Information from a series of investigations was compiled to conceptualize and quantify hydrologic components of the groundwater flow system within the DVRFS model domain and to provide hydraulic-property and head-observation data used in the calibration of the transient-flow model. These studies reevaluated natural groundwater discharge occurring through evapotranspiration (ET) and spring flow; the history of groundwater pumping from 1913 through 1998; groundwater recharge simulated as net infiltration; model boundary inflows and outflows based on regional hydraulic gradients and water budgets of surrounding areas; hydraulic conductivity and its relation to depth; and water levels appropriate for regional simulation of prepumped and pumped conditions within the DVRFS model domain. Simulation results appropriate for the regional extent and scale of the model were provided by acquiring additional data, by reevaluating existing data using current technology and concepts, and by refining earlier interpretations to reflect the current understanding of the regional groundwater flow system. Groundwater flow in the Death Valley region is composed of several interconnected, complex groundwater flow systems. Groundwater flow occurs in three subregions in relatively shallow and localized flow paths that are superimposed on deeper, regional flow paths. Regional groundwater flow is predominantly through a thick Paleozoic carbonate rock sequence affected by complex geologic structures from regional faulting and fracturing that can enhance or impede flow. Spring flow and ET are the dominant natural groundwater discharge processes. Groundwater also is withdrawn for agricultural, commercial, and domestic uses. Groundwater flow in the DVRFS was simulated using MODFLOW-2000, the U.S. Geological Survey 3D finite-difference modular groundwater flow modeling code that incorporates a nonlinear least-squares regression technique to estimate aquifer parameters. The DVRFS model has 16 layers of defined thickness, a finite-difference grid consisting of 194 rows and 160 columns, and uniform cells 1,500 meters (m) on each side. Prepumping conditions (before 1913) were used as the initial conditions for the transient-state calibration. The model uses annual stress periods with discrete recharge and discharge components.
Recharge occurs mostly from infiltration of precipitation and runoff on high mountain ranges and from a small amount of underflow from adjacent basins. Discharge occurs primarily through ET and spring discharge (both simulated as drains) and water withdrawal by pumping and, to a lesser amount, by underflow to adjacent basins simulated by constant-head boundaries. All parameter values estimated by the regression are reasonable and within the range of expected values. The simulated hydraulic heads of the final calibrated transient mode
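As a concrete illustration of the grid described above (16 layers, 194 rows, 160 columns, uniform 1,500-m cells, a steady prepumping period followed by annual transient stress periods), the following sketch lays out such a discretization with the open-source FloPy package. It is a skeletal, hypothetical setup: the layer elevations, stress-period count, and model name are placeholders, and the real DVRFS model derives its geometry from the HFM and adds many further packages (boundary conditions, recharge, drains, observations).

    import numpy as np
    import flopy

    nlay, nrow, ncol = 16, 194, 160      # grid dimensions reported for the DVRFS model
    cell_size = 1500.0                   # uniform cell spacing, meters
    nper = 87                            # placeholder: 1 steady + 86 annual stress periods

    mf = flopy.modflow.Modflow(modelname="dvrfs_sketch", version="mf2k", exe_name="mf2k")

    # Placeholder elevations; the actual model takes layer tops/bottoms from the 3D HFM.
    top = np.full((nrow, ncol), 2000.0)
    botm = np.stack([top - 250.0 * (k + 1) for k in range(nlay)])

    dis = flopy.modflow.ModflowDis(
        mf, nlay=nlay, nrow=nrow, ncol=ncol, nper=nper,
        delr=cell_size, delc=cell_size, top=top, botm=botm,
        perlen=[1.0] + [365.25] * (nper - 1),
        nstp=[1] * nper,
        steady=[True] + [False] * (nper - 1),
    )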
Cole, Jeffrey C.; Maloney, Kelly O.; Schmid, Matthias; McKenna, James E.
2014-01-01
Water temperature is an important driver of many processes in riverine ecosystems. If reservoirs are present, their releases can greatly influence downstream water temperatures. Models are important tools in understanding the influence these releases may have on the thermal regimes of downstream rivers. In this study, we developed and tested a suite of models to predict river temperature at a location downstream of two reservoirs in the Upper Delaware River (USA), a section of river that is managed to support a world-class coldwater fishery. Three empirical models were tested, including a Generalized Least Squares Model with a cosine trend (GLScos), AutoRegressive Integrated Moving Average (ARIMA), and Artificial Neural Network (ANN). We also tested one mechanistic Heat Flux Model (HFM) that was based on energy gain and loss. Predictor variables used in model development included climate data (e.g., solar radiation, wind speed, etc.) collected from a nearby weather station and temperature and hydrologic data from upstream U.S. Geological Survey gages. Models were developed with a training dataset that consisted of data from 2008 to 2011; they were then independently validated with a test dataset from 2012. Model accuracy was evaluated using root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), percent bias (PBIAS), and index of agreement (d) statistics. Model forecast success was evaluated using the baseline-modified prime index of agreement (md) at the one-, three-, and five-day predictions. All five models accurately predicted daily mean river temperature across the entire training dataset (RMSE = 0.58–1.311, NSE = 0.97–0.99, d = 0.98–0.99); ARIMA was most accurate (RMSE = 0.57, NSE = 0.99), but each model, other than ARIMA, showed short periods of under- or over-predicting observed warmer temperatures. For the training dataset, all models besides ARIMA had overestimation bias (PBIAS = −0.10 to −1.30). Validation analyses showed all models performed well; the HFM was the most accurate of the models compared (RMSE = 0.92, NSE = 0.98, d = 0.99) and the ARIMA model was the least accurate (RMSE = 2.06, NSE = 0.92, d = 0.98); however, all models had an overestimation bias (PBIAS = −4.1 to −10.20). Aside from the one-day forecast ARIMA model (md = 0.53), all models forecasted fairly well at the one-, three-, and five-day forecasts (md = 0.77–0.96). Overall, we were successful in developing models predicting daily mean temperature across a broad range of temperatures. These models, specifically the GLScos, ANN, and HFM, may serve as important tools for predicting conditions and managing thermal releases in regulated river systems such as the Delaware River. Further model development may be important in customizing predictions for particular biological or ecological needs, or for particular temporal or spatial scales.
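The accuracy statistics quoted above (RMSE, NSE, PBIAS, and the index of agreement d) are simple functions of the paired observed and simulated daily temperatures. The sketch below gives one common formulation of each, under the usual sign convention in which negative PBIAS indicates overestimation; it is an illustration with made-up numbers, not the authors' code, and the baseline-modified md variant is omitted.

    import numpy as np

    def rmse(obs, sim):
        return float(np.sqrt(np.mean((sim - obs) ** 2)))

    def nse(obs, sim):
        # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observed mean.
        return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

    def pbias(obs, sim):
        # Percent bias; negative values mean the model overestimates on average.
        return float(100.0 * np.sum(obs - sim) / np.sum(obs))

    def willmott_d(obs, sim):
        # Willmott's index of agreement, bounded between 0 and 1.
        num = np.sum((obs - sim) ** 2)
        den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        return float(1.0 - num / den)

    obs = np.array([10.2, 11.0, 12.5, 13.1, 12.0])   # hypothetical observed daily means (deg C)
    sim = np.array([10.5, 11.4, 12.2, 13.6, 12.3])   # hypothetical simulated daily means (deg C)
    print(rmse(obs, sim), nse(obs, sim), pbias(obs, sim), willmott_d(obs, sim))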
Immobilized Carbonic Anhydrase on Hollow Fiber Membranes Accelerates CO2 Removal from Blood
Arazawa, David T.; Oh, Heung-Il; Ye, Sang-Ho; Johnson, Carl A.; Woolley, Joshua R.; Wagner, William R.; Federspiel, William J.
2012-01-01
Current artificial lungs and respiratory assist devices designed for carbon dioxide removal (CO2R) are limited in their efficiency due to the relatively small partial pressure difference across gas exchange membranes. To offset this underlying diffusional challenge, bioactive hollow fiber membranes (HFMs) increase the carbon dioxide diffusional gradient through the immobilized enzyme carbonic anhydrase (CA), which converts bicarbonate to CO2 directly at the HFM surface. In this study, we tested the impact of CA immobilization on HFM CO2 removal efficiency and thromboresistance in blood. Fiber surface modification with radio frequency glow discharge (RFGD) introduced hydroxyl groups, which were activated by 1 M CNBr while 1.5 M TEA was added dropwise over the activation time course; incubation with a CA solution then covalently linked the enzyme to the surface. The bioactive HFMs were then potted in a model gas exchange device (0.0084 m2) and tested in a recirculation loop with a CO2 inlet of 50 mm Hg under steady blood flow. Using an esterase activity assay, CNBr chemistry with TEA resulted in 0.99 U of enzyme activity, a 3.3-fold increase in immobilized CA activity compared to our previous method. These bioactive HFMs demonstrated a CO2 removal rate of 108 ml/min/m2, marking a 36% increase compared to unmodified HFMs (p < 0.001). Thromboresistance of CA-modified HFMs was assessed in terms of adherent platelets on surfaces by using a lactate dehydrogenase (LDH) assay as well as scanning electron microscopy (SEM) analysis. Results indicated HFMs with CA modification had 95% less platelet deposition compared to unmodified HFM (p < 0.01). Overall, these findings revealed that increased CO2 removal can be realized through bioactive HFMs, enabling a next generation of more efficient intravascular and paracorporeal respiratory assist devices for CO2 removal. PMID:22962517
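Note that the removal rate above is normalized by membrane area, so the absolute exchange of the small test module follows directly from the numbers given in the abstract; the short calculation below simply makes that normalization explicit.

    # CO2 removal is reported per unit fiber membrane area (ml/min/m2).
    membrane_area_m2 = 0.0084       # model gas exchange device used in the study
    removal_per_m2 = 108.0          # ml/min/m2 for the CA-modified fibers
    print(f"~{removal_per_m2 * membrane_area_m2:.2f} ml CO2/min for the whole test module")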
A novel step osteotomy for correction of hemifacial microsomia - A case report.
Howlader, Debraj; Bhutia, Dichen P; Vignesh, U; Mehrotra, Divya
2016-01-01
Facial asymmetry is one of the commonest facial anomalies, with a reported incidence as high as 34%. Hemifacial microsomia (HFM) has an incidence of 1 in every 4000-5600 children and is one of the commonest causes of facial asymmetry. The standard treatment of HFM is orthognathic surgery by bilateral sagittal split osteotomy (BSSO) or distraction osteogenesis (DO) of the mandible, both of which involve prolonged periods of occlusal adjustment by an orthodontist. Here, we present distraction of the mandible by means of a novel modified step osteotomy to correct the facial asymmetry in a case of hemifacial microsomia without disturbing the occlusion. This novel technique can prove to be a new tool in the maxillofacial surgeon's armamentarium to treat facial asymmetry.
Computational Failure Modeling of Accelerative Injuries to the Lower Leg Below the Knee
2013-03-01
accelerations in the lower extremities in the range of 155 to 217 G [Nilakantan and Tabiei, 2009]. During an underbody blast event, within 0.5 ms of the... acceleration and subsequent deformations of the plate which apply significant loads to the soldier’s lower extremities [NATO HFM-090, Task Group 25, 2007... acceleration of the vehicle and the collisions that follow are also a significant source of injury, especially if the soldier is not properly restrained. In
Hemifacial microsomia in cat-eye syndrome: 22q11.1-q11.21 as candidate loci for facial symmetry.
Quintero-Rivera, Fabiola; Martinez-Agosto, Julian A
2013-08-01
Cat-Eye syndrome (CES) (OMIM 115470), also known as chromosome 22 partial tetrasomy or inverted duplicated 22q11, was first reported by Haab [1879] based on the primary features of eye coloboma and anal atresia. However, >60% of the patients lack these primary features. Here, we present a 9-month-old female who at birth was noted to have multiple defects, including facial asymmetry with asymmetric retrognathia, bilateral mandibular hypoplasia, branchial cleft sinus, right-sided muscular torticollis, esotropia, and an atretic right ear canal with low-to-moderate sensorineural hearing loss, bilateral preauricular ear tags/pits, and two skin tags on her left cheek. There were no signs of any colobomas or anal atresia. Hemifacial microsomia (HFM) was suspected clinically. Chromosome studies and FISH identified an extra marker originating from 22q11 consistent with CES, and this was confirmed by aCGH. This report expands the phenotypic variability of CES and includes partial tetrasomy of 22q11.1-q11.21 in the differential diagnosis of HFM. In addition, our case, as well as the previous association of 22q11.2 deletions and duplications with facial asymmetry and features of HFM, supports the hypothesis that this chromosome region harbors genes important in the regulation of body plan symmetry, and in particular facial harmony. Copyright © 2013 Wiley Periodicals, Inc.
Belcher, Wayne R.
2004-01-01
A numerical three-dimensional (3D) transient ground-water flow model of the Death Valley region was developed by the U.S. Geological Survey for the U.S. Department of Energy programs at the Nevada Test Site and at Yucca Mountain, Nevada. Decades of study of aspects of the ground-water flow system and previous less extensive ground-water flow models were incorporated and reevaluated together with new data to provide greater detail for the complex, digital model. A 3D digital hydrogeologic framework model (HFM) was developed from digital elevation models, geologic maps, borehole information, geologic and hydrogeologic cross sections, and other 3D models to represent the geometry of the hydrogeologic units (HGUs). Structural features, such as faults and fractures, that affect ground-water flow also were added. The HFM represents Precambrian and Paleozoic crystalline and sedimentary rocks, Mesozoic sedimentary rocks, Mesozoic to Cenozoic intrusive rocks, Cenozoic volcanic tuffs and lavas, and late Cenozoic sedimentary deposits of the Death Valley Regional Ground-Water Flow System (DVRFS) region in 27 HGUs. Information from a series of investigations was compiled to conceptualize and quantify hydrologic components of the ground-water flow system within the DVRFS model domain and to provide hydraulic-property and head-observation data used in the calibration of the transient-flow model. These studies reevaluated natural ground-water discharge occurring through evapotranspiration and spring flow; the history of ground-water pumping from 1913 through 1998; ground-water recharge simulated as net infiltration; model boundary inflows and outflows based on regional hydraulic gradients and water budgets of surrounding areas; hydraulic conductivity and its relation to depth; and water levels appropriate for regional simulation of prepumped and pumped conditions within the DVRFS model domain. Simulation results appropriate for the regional extent and scale of the model were provided by acquiring additional data, by reevaluating existing data using current technology and concepts, and by refining earlier interpretations to reflect the current understanding of the regional ground-water flow system. Ground-water flow in the Death Valley region is composed of several interconnected, complex ground-water flow systems. Ground-water flow occurs in three subregions in relatively shallow and localized flow paths that are superimposed on deeper, regional flow paths. Regional ground-water flow is predominantly through a thick Paleozoic carbonate rock sequence affected by complex geologic structures from regional faulting and fracturing that can enhance or impede flow. Spring flow and evapotranspiration (ET) are the dominant natural ground-water discharge processes. Ground water also is withdrawn for agricultural, commercial, and domestic uses. Ground-water flow in the DVRFS was simulated using MODFLOW-2000, a 3D finite-difference modular ground-water flow modeling code that incorporates a nonlinear least-squares regression technique to estimate aquifer parameters. The DVRFS model has 16 layers of defined thickness, a finite-difference grid consisting of 194 rows and 160 columns, and uniform cells 1,500 m on each side. Prepumping conditions (before 1913) were used as the initial conditions for the transient-state calibration. The model uses annual stress periods with discrete recharge and discharge components. 
Recharge occurs mostly from infiltration of precipitation and runoff on high mountain ranges and from a small amount of underflow from adjacent basins. Discharge occurs primarily through ET and spring discharge (both simulated as drains) and water withdrawal by pumping and, to a lesser amount, by underflow to adjacent basins, also simulated by drains. All parameter values estimated by the regression are reasonable and within the range of expected values. The simulated hydraulic heads of the final calibrated transient model gener
2010-09-01
what level of detail is needed to build their teams, and they can add more detailed items from the model in order to tap deeper in the performance of...of a project on ‘Command Team Effectiveness’ by Task Group 127 for the RTO Human Factors and Medicine Panel (RTG HFM-127). Published...vérification du modèle et de l’instrument) This Technical Report documents the findings of a project on ‘Command Team Effectiveness’ by Task Group
Luo, Gang; Wang, Wen; Angelidaki, Irini
2013-09-17
Syngas is produced by thermal gasification of both nonrenewable and renewable sources including biomass and coal, and it consists mainly of CO, CO2, and H2. In this paper we aim to bioconvert CO in the syngas to CH4. A novel technology for simultaneous sewage sludge treatment and CO biomethanation in an anaerobic reactor was presented. Batch experiments showed that CO was inhibitory to methanogens, but not to bacteria, at CO partial pressure between 0.25 and 1 atm under thermophilic conditions. During anaerobic digestion of sewage sludge supplemented with CO added through a hollow fiber membrane (HFM) module in continuous thermophilic reactors, CO did not inhibit the process even at a pressure as high as 1.58 atm inside the HFM, due to the low dissolved CO concentration in the liquid. Complete consumption of CO was achieved with CO gas retention time of 0.2 d. Results from high-throughput sequencing analysis showed clear differences of the microbial community structures between the samples from liquid and biofilm on the HFM in the reactor with CO addition. Species close to Methanosarcina barkeri and Methanothermobacter thermautotrophicus were the two main archaeal species involved in CO biomethanation. However, the two species were distributed differently in the liquid phase and in the biofilm. Although the carboxidotrophic activities test showed that CO was converted by both archaea and bacteria, the bacterial species responsible for CO conversion are unknown.
Holographic techniques for cellular fluorescence microscopy
NASA Astrophysics Data System (ADS)
Kim, Myung K.
2017-04-01
We have constructed a prototype instrument for holographic fluorescence microscopy (HFM) based on self-interference incoherent digital holography (SIDH) and demonstrate novel imaging capabilities such as differential 3D fluorescence microscopy and optical sectioning by compressive sensing.
Regulation of DNA conformations and dynamics in flows with hybrid field microfluidics.
Ren, Fangfang; Zu, Yingbo; Kumar Rajagopalan, Kartik; Wang, Shengnian
2012-01-01
Visualizing single-DNA dynamics in flow provides a wealth of physical insights for biophysics and complex flow studies. However, large signal fluctuations, generated by diverse conformations, deformation-history-dependent dynamics, and flow-induced stochastic tumbling, often frustrate its wide adoption in single-molecule and polymer flow studies. We use a hybrid field microfluidic (HFM) approach, in which an electric field is imposed at desired locations and appropriate moments to balance the flow stress on charged molecules, to effectively regulate the initial conformations and the deformation dynamics of macromolecules in flow. With λ-DNA and a steady laminar shear flow as the model system, we studied the performance of HFM in regulating DNA trapping, relaxation, coil-stretch transition, and accumulation. DNA molecules were captured in the focal planes when the motions caused by the flow and by the electric field were balanced. The trapped macromolecules relaxed via two different routes while eventually becoming more uniform in size and globular conformation. When the electric field was removed, the sudden stretching dynamics of the DNA molecules exhibited a more pronounced extension overshoot in their transient response under a true step change in flow stress, while otherwise behaving similarly to earlier reports in steady shear flow. Such regulation strategies could be useful for controlling the conformations of other important macromolecules (e.g., proteins) and for better revealing their molecular dynamics.
Kim, Sungryul; Yoo, Younghwan
2018-01-26
Medium Access Control (MAC) delay, which occurs between the anchor nodes' transmissions, is one of the error sources in underwater localization. In particular, in AUV localization, the MAC delay significantly degrades the ranging accuracy. An analysis based on the Cramér-Rao lower bound (CRLB) shows theoretically that the MAC delay significantly degrades localization performance. This paper proposes underwater localization combined with multiple access technology to decouple localization performance from the MAC delay. Towards this goal, we adopt hyperbolic frequency modulation (HFM) signals, which provide multiplexing thanks to their high temporal correlation properties. Owing to the multiplexing ability of the HFM signal, the anchor nodes can transmit packets without MAC delay, i.e., simultaneous transmission is possible. In addition, the simulation results show that simultaneous transmission is not an optional communication scheme but is essential for the localization of mobile objects underwater.
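The multiplexing property exploited above rests on the sharp, Doppler-tolerant autocorrelation of hyperbolic frequency-modulated sweeps: overlapping HFM pulses can still be separated, and their arrival times read off, by matched filtering. The sketch below generates a basic HFM pulse and recovers a known delay from the correlation peak; the band, duration, sample rate, and noise level are illustrative values, not parameters from the paper.

    import numpy as np

    def hfm_pulse(f0, f1, duration, fs):
        # Hyperbolic FM pulse sweeping from f0 to f1 (Hz) over `duration` seconds.
        # Instantaneous frequency f(t) = 1 / (1/f0 + b*t) with b = (1/f1 - 1/f0) / T,
        # giving phase phi(t) = (2*pi/b) * ln(1 + b*f0*t).
        t = np.arange(int(duration * fs)) / fs
        b = (1.0 / f1 - 1.0 / f0) / duration
        phase = (2.0 * np.pi / b) * np.log(1.0 + b * f0 * t)
        return np.cos(phase)

    fs = 96_000                                   # sample rate (Hz), illustrative
    pulse = hfm_pulse(20_000.0, 30_000.0, 0.02, fs)

    # Matched filtering: correlate a delayed, noisy copy against the clean replica
    # and read the propagation delay off the location of the correlation peak.
    true_delay = 3_000                            # samples
    rng = np.random.default_rng(0)
    received = np.concatenate([np.zeros(true_delay), pulse])
    received = received + 0.1 * rng.standard_normal(received.size)
    corr = np.correlate(received, pulse, mode="valid")
    print("estimated delay (samples):", int(np.argmax(np.abs(corr))))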
21 CFR 600.2 - Mailing addresses.
Code of Federal Regulations, 2010 CFR
2010-04-01
... (HFM-99), Center for Biologics Evaluation and Research, Food and Drug Administration, 1401 Rockville... biological products regulated by the Center for Drug Evaluation and Research (CDER). Unless otherwise stated... Research, Food and Drug Administration, 12229 Wilkins Ave., Rockville, MD 20852. Examples of such...
21 CFR 600.2 - Mailing addresses.
Code of Federal Regulations, 2011 CFR
2011-04-01
... (HFM-99), Center for Biologics Evaluation and Research, Food and Drug Administration, 1401 Rockville... biological products regulated by the Center for Drug Evaluation and Research (CDER). Unless otherwise stated... Research, Food and Drug Administration, 12229 Wilkins Ave., Rockville, MD 20852. Examples of such...
BIOWINOL TECHNOLOGIES: A HYBRID GREEN PROCESS FOR BIOFUEL PRODUCTION – PHASE 2
The development of hollow fiber membrane (HFM) reactor will result in improved gas utilization that will positively impact overall process efficiencies. Successful completion of this project could result in the development of many decentralized biofuel production systems near ...
CD36 Modulates Fasting and Preabsorptive Hormone and Bile Acid Levels.
Shibao, Cyndya A; Celedonio, Jorge E; Tamboli, Robyn; Sidani, Reem; Love-Gregory, Latisha; Pietka, Terri; Xiong, Yanhua; Wei, Yan; Abumrad, Naji N; Abumrad, Nada A; Flynn, Charles Robb
2018-05-01
Abnormal fatty acid (FA) metabolism contributes to diabetes and cardiovascular disease. The FA receptor CD36 has been linked to risk of metabolic syndrome. In rodents CD36 regulates various aspects of fat metabolism, but whether it has similar actions in humans is unknown. We examined the impact of a coding single-nucleotide polymorphism in CD36 on postprandial hormone and bile acid (BA) responses. To examine whether the minor allele (G) of coding CD36 variant rs3211938 (G/T), which reduces CD36 level by ∼50%, influences hormonal responses to a high-fat meal (HFM). Obese African American (AA) women carriers of the G allele of rs3211938 (G/T) and weight-matched noncarriers (T/T) were studied before and after a HFM. Two-center study. Obese AA women. HFM. Early preabsorptive responses (10 minutes) and extended excursions in plasma hormones [C-peptide, insulin, incretins, ghrelin fibroblast growth factor (FGF)19, FGF21], BAs, and serum lipoproteins (chylomicrons, very-low-density lipoprotein) were determined. At fasting, G-allele carriers had significantly reduced cholesterol and glycodeoxycholic acid and consistent but nonsignificant reductions of serum lipoproteins. Levels of GLP-1 and pancreatic polypeptide (PP) were reduced 60% to 70% and those of total BAs were 1.8-fold higher. After the meal, G-allele carriers displayed attenuated early (-10 to 10 minute) responses in insulin, C-peptide, GLP-1, gastric inhibitory peptide, and PP. BAs exhibited divergent trends in G allele carriers vs noncarriers concomitant with differential FGF19 responses. CD36 plays an important role in the preabsorptive hormone and BA responses that coordinate brain and gut regulation of energy metabolism.
21 CFR 312.140 - Address for correspondence.
Code of Federal Regulations, 2012 CFR
2012-04-01
... biological products regulated by CBER. Send the IND submission to the Document Control Center (HFM-99... North VII, 7620 Standish Pl., Rockville, MD 20855. (2) For biological products regulated by CDER. Send the IND submission to the CDER Therapeutic Biological Products Document Room, Center for Drug...
21 CFR 312.140 - Address for correspondence.
Code of Federal Regulations, 2013 CFR
2013-04-01
... biological products regulated by CBER. Send the IND submission to the Document Control Center (HFM-99... North VII, 7620 Standish Pl., Rockville, MD 20855. (2) For biological products regulated by CDER. Send the IND submission to the CDER Therapeutic Biological Products Document Room, Center for Drug...
Regenerative Medicine at Early Echelons: Changing Medical Care & Outcomes
2010-04-01
HFM-182 combat lifesaver. First aid includes tourniquet application, fracture stabilization with splints , and application of sterile dressings to...dysfunction and remodeling after myocardial infarction. Stem Cells. 2008; 26:1646-1655. PMID: 18420834. [72] Shin D.M., Zuba-Surma E.K., Wu W
Time Sensitive Course of Action Development and Evaluation
2010-10-01
Applications militaires de la modelisation humaine ). RTO-MP-HFM-202 14. ABSTRACT The development of courses of action that integrate military with...routes between the capital town C of the province and a neighboring country M. Both roads are historically significant smuggling routes. There were
77 FR 7588 - Blood Products Advisory Committee; Cancellation
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-13
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-0001] Blood Products Advisory Committee; Cancellation AGENCY: Food and Drug Administration, HHS. ACTION... Research (HFM-71), Food and Drug Administration, 1401 Rockville Pike, Rockville, MD 20852, Contact 1- 301...
MICROBIAL COMETABOLISM OF RECALCITRANT CHEMICALS IN CONTAMINATED AIR STREAMS
Chlorinated Solvents: The treatment system consists of a laboratory-scale hollow fiber membrane (HFM) module containing a center baffle and a radial cross-flow pattern on the shell side of the fibers. The shell and lumen fluids are contacting in a counter-current f...
21 CFR 814.104 - Original applications.
Code of Federal Regulations, 2011 CFR
2011-04-01
... information to the Document Control Center (HFM-99), Center for Biologics Evaluation and Research, Food and... Control Room, Center for Drug Evaluation and Research, Food and Drug Administration, 5901-B Ammendale Rd... or diagnose the disease or condition. The application also shall contain a discussion of the risks...
21 CFR 814.104 - Original applications.
Code of Federal Regulations, 2012 CFR
2012-04-01
... information to the Document Control Center (HFM-99), Center for Biologics Evaluation and Research, Food and... Control Room, Center for Drug Evaluation and Research, Food and Drug Administration, 5901-B Ammendale Rd... or diagnose the disease or condition. The application also shall contain a discussion of the risks...
21 CFR 814.104 - Original applications.
Code of Federal Regulations, 2013 CFR
2013-04-01
... information to the Document Control Center (HFM-99), Center for Biologics Evaluation and Research, Food and... Control Room, Center for Drug Evaluation and Research, Food and Drug Administration, 5901-B Ammendale Rd... or diagnose the disease or condition. The application also shall contain a discussion of the risks...
21 CFR 600.80 - Postmarketing reporting of adverse experiences.
Code of Federal Regulations, 2011 CFR
2011-04-01
... epidemiological/surveillance studies, reports in the scientific literature, and unpublished scientific papers... products to the Center for Biologics Evaluation and Research (HFM-210), or to the Center for Drug Evaluation and Research (see mailing addresses in § 600.2). Submit all vaccine adverse experience reports to...
21 CFR 600.80 - Postmarketing reporting of adverse experiences.
Code of Federal Regulations, 2012 CFR
2012-04-01
... epidemiological/surveillance studies, reports in the scientific literature, and unpublished scientific papers... products to the Center for Biologics Evaluation and Research (HFM-210), or to the Center for Drug Evaluation and Research (see mailing addresses in § 600.2). Submit all vaccine adverse experience reports to...
21 CFR 600.80 - Postmarketing reporting of adverse experiences.
Code of Federal Regulations, 2014 CFR
2014-04-01
... epidemiological/surveillance studies, reports in the scientific literature, and unpublished scientific papers... products to the Center for Biologics Evaluation and Research (HFM-210), or to the Center for Drug Evaluation and Research (see mailing addresses in § 600.2). Submit all vaccine adverse experience reports to...
21 CFR 600.80 - Postmarketing reporting of adverse experiences.
Code of Federal Regulations, 2013 CFR
2013-04-01
... epidemiological/surveillance studies, reports in the scientific literature, and unpublished scientific papers... products to the Center for Biologics Evaluation and Research (HFM-210), or to the Center for Drug Evaluation and Research (see mailing addresses in § 600.2). Submit all vaccine adverse experience reports to...
75 FR 72834 - Blood Products Advisory Committee; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-26
... Pearline Muckelvene, Center for Biologics Evaluation and Research, Food and Drug Administration (HFM- 71... following topics: (1) November 4 and 5, 2010, meeting of the Health and Human Services Advisory Committee on... (3) Research programs in the Laboratories of Hemostasis and Plasma Derivatives, Division of...
Monitoring Physical and Cognitive Performance During Sustained Military Operations
2009-10-01
performances humaines dans les operations militaires de l’OTAN (Science, Technologie et Ethique)). RTO Human Factors and Medicine Panel (HFM) Symposium...Three levels of difficulty (Zero-back, One-back, Two-back) were included. The participants were presented with a series of capital letters on their
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-30
... Quantitative Risk Assessment: Blood Safety and Availability; Public Workshops AGENCY: Food and Drug... and Transplantation Safety'' (EID public workshop) and ``Quantitative Risk Assessment: Blood Safety... Research (HFM-302), Food and Drug Administration, 1401 Rockville Pike, Suite 550N, Rockville, MD 20852-1448...
Exercise and postprandial lipemia: effects on vascular health in inactive adults.
Ramírez-Vélez, Robinson; Correa-Rodríguez, María; Tordecilla-Sanders, Alejandra; Aya-Aldana, Viviana; Izquierdo, Mikel; Correa-Bautista, Jorge Enrique; Álvarez, Cristian; Garcia-Hermoso, Antonio
2018-04-03
There is evidence to suggest that postprandial lipemia is linked to the impairment of endothelial function, which is characterized by an imbalance between the actions of vasodilators and vasoconstrictors. The aim of this study was to determine the effects of 12-week high-intensity training (HIT) and moderate continuous training (MCT) protocols on postprandial lipemia, vascular function and arterial stiffness in inactive adults after high-fat meal (HFM) ingestion. A randomized clinical trial was conducted in 20 healthy, inactive adults (31.6 ± 7.1 years). Participants followed the two exercise protocols for 12 weeks. To induce a state of postprandial lipemia (PPL), all subjects received an HFM. Endothelial function was measured using flow-mediated vasodilation (FMD), normalized brachial artery FMD (nFMD), aortic pulse wave velocity (PWV) and augmentation index (AIx). Plasma total cholesterol, high-density lipoprotein cholesterol (HDL-c), triglycerides and glucose were also measured. The effects of an HFM were evaluated in a fasted state and 60, 120, 180, and 240 min postprandially. A significant decrease in serum glucose between 0 min (fasted state) and 120 min postprandially was found in the HIT group (P = 0.035). Likewise, FMD (%) was significantly different between the fasted state and 60 min after an HFM in the HIT group (P = 0.042). The total cholesterol response expressed as the area under the curve (AUC) (0-240) was lower following HIT than following MCT, but no significant differences were observed (8%, P > 0.05). Similarly, the triglyceride AUC (0-240) was also lower after HIT compared with MCT, which trended towards significance (24%, P = 0.076). The AUC (0-240) for the glucose response was significantly lower following HIT than MCT (10%, P = 0.008). FMD and nFMD AUC (0-240) were significantly higher following HIT than following MCT (46.9%, P = 0.021 and 67.3%, P = 0.009, respectively). PWV AUC (0-240) did not differ between the two exercise groups (2.3%, P > 0.05). Supervised exercise training mitigates endothelial dysfunction and the glucose response induced by PPL. Exercise intensity plays an important role in these protective effects, and medium-term HIT may be more effective than MCT in reducing postprandial glucose levels and attenuating vascular impairment. ClinicalTrials.gov ID: NCT02738385 Date of registration: April 14, 2016.
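As a point of reference for the FMD and nFMD outcomes above, flow-mediated dilation is conventionally expressed as the percent increase of the brachial artery diameter over its baseline value, and one common (assumed here) normalization divides that percentage by the shear-rate stimulus. The sketch below uses hypothetical measurements, not data from the trial.

    def fmd_percent(baseline_diameter_mm, peak_diameter_mm):
        # Flow-mediated dilation as percent change from the baseline diameter.
        return 100.0 * (peak_diameter_mm - baseline_diameter_mm) / baseline_diameter_mm

    fmd = fmd_percent(4.00, 4.28)          # hypothetical brachial artery diameters (mm)
    shear_rate_auc = 18_500.0              # illustrative shear-rate AUC up to peak dilation
    nfmd = fmd / shear_rate_auc            # assumed normalization; conventions vary by lab
    print(f"FMD = {fmd:.1f}%, nFMD = {nfmd:.2e} % per unit shear-rate AUC")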
21 CFR 812.19 - Address for IDE correspondence.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Document Control Center (HFM-99), Center for Biologics Evaluation and Research, Food and Drug... Evaluation and Research, Food and Drug Administration, 5901-B Ammendale Rd., Beltsville, MD 20705-1266. (b... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Address for IDE correspondence. 812.19 Section 812...
50 CFR 218.235 - Requirements for monitoring.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Low Frequency Active (SURTASS LFA) Sonar § 218.235 Requirements for monitoring. (a) The Holder of a...) during operations that employ SURTASS LFA sonar in the active mode. The SURTASS vessels shall have... frequency passive SURTASS sonar to listen for vocalizing marine mammals; and (3) Use the HF/M3 active sonar...
50 CFR 218.235 - Requirements for monitoring.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Low Frequency Active (SURTASS LFA) Sonar § 218.235 Requirements for monitoring. (a) The Holder of a...) during operations that employ SURTASS LFA sonar in the active mode. The SURTASS vessels shall have... frequency passive SURTASS sonar to listen for vocalizing marine mammals; and (3) Use the HF/M3 active sonar...
50 CFR 218.235 - Requirements for monitoring.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Low Frequency Active (SURTASS LFA) Sonar § 218.235 Requirements for monitoring. (a) The Holder of a...) during operations that employ SURTASS LFA sonar in the active mode. The SURTASS vessels shall have... frequency passive SURTASS sonar to listen for vocalizing marine mammals; and (3) Use the HF/M3 active sonar...
An Approach to Embedded Training for Future Leaders and Staff
2009-10-01
13. SUPPLEMENTARY NOTES See also ADA562526. RTO-MP-HFM-169 Human Dimensions in Embedded Virtual Simulation (Les dimensions humaines dans la...order to better capitalize on follow-on operations. 4.10 Theme 7: Sustain Unit Operations Theme 7 is defined as the ability of Soldiers and
21 CFR 812.19 - Address for IDE correspondence.
Code of Federal Regulations, 2011 CFR
2011-04-01
... and Radiological Health, Document Mail Center, 10903 New Hampshire Ave., Bldg. 66, rm. G609, Silver Spring, MD 20993-0002. (2) For devices regulated by the Center for Biologics Evaluation and Research, send it to the Document Control Center (HFM-99), Center for Biologics Evaluation and Research, Food and...
21 CFR 203.12 - An appeal from an adverse decision by the district office.
Code of Federal Regulations, 2010 CFR
2010-04-01
... to the Office of Compliance, Center for Drug Evaluation and Research, Food and Drug Administration... and Biologics Quality (HFM-600), Center for Biologics Evaluation and Research, Food and Drug... and Research, Food and Drug Administration, 10903 New Hampshire Ave., Silver Spring, MD 20993-0002...
77 FR 30887 - Amendments to Sterility Test Requirements for Biological Products; Correction
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-24
... Biological Products; Correction AGENCY: Food and Drug Administration, HHS. ACTION: Final rule, correction... Evaluation and Research (HFM-17), Food and Drug Administration, 1401 Rockville Pike, Suite 200N, Rockville... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Parts 600, 610, and...
21 CFR 807.90 - Format of a premarket notification submission.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Document Control Center (HFM-99), Center for Biologics Evaluation and Research, Food and Drug... Evaluation and Research, Food and Drug Administration, 5901-B Ammendale Rd., Beltsville, MD 20705-1266... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Format of a premarket notification submission. 807...
21 CFR 203.12 - An appeal from an adverse decision by the district office.
Code of Federal Regulations, 2011 CFR
2011-04-01
... to the Office of Compliance, Center for Drug Evaluation and Research, Food and Drug Administration... and Biologics Quality (HFM-600), Center for Biologics Evaluation and Research, Food and Drug... and Research, Food and Drug Administration, 10903 New Hampshire Ave., Silver Spring, MD 20993-0002...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-22
... Selection Process for Nominations for Voting and/or Nonvoting Consumer Representatives on Public Advisory... statements of interest from consumer organizations interested in participating in the selection process and... Evaluation and Vaccines and Related Biological Products. Research, 1401 Rockville Pike (HFM-71), Rockville...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-27
... Drug Information, Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New..., and Development (HFM-40), Center for Biologics Evaluation and Research, Food and Drug Administration..., Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 51...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-13
... Drug Information, Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New... and Development (HFM-40), Center for Biologics Evaluation and Research, Food and Drug Administration..., Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 22...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-12
... Biologics Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, Suite 200N... Person) at least 7 days in advance. SUPPLEMENTARY INFORMATION: Quantitative risk assessments (QRAs) are... and maintaining critical relationships both within the Center for Biologics Evaluation and Research...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
... and Development (HFM-40), Center for Biologics Evaluation and Research (CBER), Food and Drug... Drug Information, Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New..., Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 51...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-09
.... This notice is the Agency's report on the status of the studies and clinical trials that applicants... Drug Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 22, Rm... and Research (HFM- 25), Food and Drug Administration, 1400 Rockville Pike, Rockville, MD 20852, 301...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-29
... already published by FDA in 2004 summarizing its survey research results on the public health impacts of... and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 51, rm. 3238, Silver... Ripley, Center for Biologics Evaluation and Research (HFM-17), Food and Drug Administration, 1401...
75 FR 15639 - Revision of the Requirements for Constituent Materials
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... and Research (HFM-17), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville, MD... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 610 [Docket No. FDA-2010-N-0099] RIN 0910-AG15 Revision of the Requirements for Constituent Materials AGENCY: Food and...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
... Products Advisory Committee AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food... Biological Products Advisory Committee for the Center for Biologics Evaluation and Research (CBER) notify FDA... Evaluation and Research (HFM-71), Food and Drug Administration, 1401 Rockville Pike, Rockville, MD 20852-1448...
21 CFR 660.36 - Samples and protocols.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Samples and protocols. 660.36 Section 660.36 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS... Research Sample Custodian (ATTN: HFM-672) (see mailing addresses in § 600.2 of this chapter), within 30...
Wang, Wen; Xie, Li; Luo, Gang; Zhou, Qi; Angelidaki, Irini
2013-10-01
A new method for simultaneous coke oven gas (COG) biomethanation and in situ biogas upgrading in an anaerobic reactor was developed in this study. The simulated coke oven gas (SCOG) (92% H2 and 8% CO) was injected directly into the anaerobic reactor treating sewage sludge through a hollow fiber membrane (HFM). With pH control at 8.0, the added H2 and CO were fully consumed and no negative effects on the anaerobic degradation of sewage sludge were observed. The maximum CH4 content in the biogas was 99%. The addition of SCOG resulted in enrichment and dominance of the homoacetogenic genus Treponema and the hydrogenotrophic genus Methanoculleus in the liquid, which indicated that H2 was converted to methane by both direct (hydrogenotrophic methanogenesis) and indirect (homoacetogenesis + aceticlastic methanogenesis) pathways in the liquid. However, the aceticlastic genus Methanosaeta was the dominant archaeon in the biofilm on the HFM, which indicated an indirect (homoacetogenesis + aceticlastic methanogenesis) H2 conversion pathway on the biofilm. Copyright © 2013 Elsevier Ltd. All rights reserved.
Magnetic field experiment for Voyagers 1 and 2
NASA Technical Reports Server (NTRS)
Behannon, K. W.; Acuna, M. H.; Burlaga, L. F.; Lepping, R. P.; Ness, N. F.; Neubauer, F. M.
1977-01-01
The magnetic field experiment to be carried on the Voyager 1 and 2 missions consists of dual low field (LFM) and high field magnetometer (HFM) systems. The dual systems provide greater reliability and, in the case of the LFM's, permit the separation of spacecraft magnetic fields from the ambient fields. Additional reliability is achieved through electronics redundancy. The wide dynamic ranges of plus or minus 0.5G for the LFM's and plus or minus 20G for the HFM's, low quantization uncertainty of plus or minus 0.002 gamma in the most sensitive (plus or minus 8 gamma) LFM range, low sensor RMS noise level of 0.006 gamma, and use of data compaction schemes to optimize the experiment information rate all combine to permit the study of a broad spectrum of phenomena during the mission. Planetary fields at Jupiter, Saturn, and possibly Uranus; satellites of these planets; solar wind and satellite interactions with the planetary fields; and the large-scale structure and microscale characteristics of the interplanetary magnetic field are studied. The interstellar field may also be measured.
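The quoted quantization figures can be sanity-checked with simple arithmetic: assuming the plus or minus 8 gamma LFM range is digitized uniformly and that the plus or minus 0.002 gamma uncertainty corresponds to half a quantization step, the implied resolution is about 12 bits. A minimal sketch under those assumptions (not taken from the instrument documentation):

import math

full_scale = 16.0   # gamma, total span of the +/-8 gamma range
half_step = 0.002   # gamma, quoted quantization uncertainty (assumed to be step/2)
step = 2 * half_step
levels = full_scale / step
print(levels, math.ceil(math.log2(levels)))   # 4000 levels -> about 12 bits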
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-06
.... This notice is the Agency's report on the status of the studies and clinical trials that applicants... Drug Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 22, Rm... and Research (HFM-17), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville, MD...
Code of Federal Regulations, 2011 CFR
2011-04-01
... Manufacturers Assistance (HFM-48), Center for Biologics Evaluation and Research, Food and Drug Administration... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Will establishment registrations and HCT/P....37 Section 1271.37 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN...
77 FR 50121 - Hospira, Inc.; Withdrawal of Approval of a New Drug Application for DEXTRAN 70
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-20
..., Center for Biologics Evaluation and Research (HFM-17), Food and Drug Administration, 1401 Rockville Pike... and Research, by the Commissioner of Food and Drugs, approval of NDA 080- 819, DEXTRAN 70 [6% Dextran... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-0840...
21 CFR 822.8 - When, where, and how must I submit my postmarket surveillance plan?
Code of Federal Regulations, 2010 CFR
2010-04-01
... submission to the Document Control Center (HFM-99), Center for Biologics Evaluation and Research, Food and... Document Room, Center for Drug Evaluation and Research, Food and Drug Administration, 5901-B Ammendale Rd... 21 Food and Drugs 8 2010-04-01 2010-04-01 false When, where, and how must I submit my postmarket...
Sensitive Equipment Decontamination
2017-10-01
tomorrow’s CBRN threats. The document was elaborated by a group of scientists and CBRN specialists, all being members of the Hazard Management ... Management Panel within NATO Joint CBRN Defence Capability Development Group ii STO-TR-HFM-233 The NATO Science and Technology...accordance with NATO policies. The total spectrum of this collaborative effort is addressed by six Technical Panels who manage a wide range of scientific
2004-09-01
technically in the underdog situation. It obviously has made it crucial to “win the media war”. Media coverage now has a dramatic effect on public opinion...Ever more effective body armour reduces fatal casualty numbers but increases significantly the medical challenge to save the survivor’s limbs. As a
15 Years of R&D on high field accelerator magnets at FNAL
Barzi, Emanuela; Zlobin, Alexander V.
2016-07-01
The High Field Magnet (HFM) Program at Fermi National Accelerator Laboratory (FNAL) has been developing Nb3Sn superconducting magnets, materials and technologies for present and future particle accelerators since the late 1990s. This paper summarizes the main results of the Nb3Sn accelerator magnet and superconductor R&D at FNAL and outlines the Program's next steps.
Pharmacological Correction of the Human Functional State in High Altitude Conditions
2001-06-01
Operational Medical Issues in Hypo-and Hyperbaric Conditions [les Questions medicales a caractere oprationel liees aux conditions hypobares ou hyperbares ...Cholesterol, Adaptation Paper presented at the RTO HFM Symposium on "Operational Medical Issues in Hypo- and Hyperbaric Conditions", held in Toronto...T.D., 1986, Recovery after Extreme Hypobaric Hypoxia as a Method of Study of Antihypoxic Activity of Chemical Compounds. In: Farmakologicheskaya
2009-10-01
BIO-INSPIRED HUMAN PERFORMANCE ENHANCEMENT 3.1 Biological performance currently outside of the bounds of the human species HPE opportunities may...strategies to preferentially burn fat in weight reduction (85). 3.2 Bio-inspired opportunities for human performance There are many interesting...solutions to assist human performance Nonmedical applications of bio-inspired engineering and computing technologies are a recognized priority in
2012-01-01
RTA HFM-201/RSM PAPER 3 - 1 © 2012 The MITRE Corporation. All Rights Reserved. Social Radar Barry Costa and John Boiney MITRE Corporation...defenders require an integrated set of capabilities that we refer to as a “social radar.” Such a system would support strategic- to operational-level...situation awareness, alerting, course of action analysis, and measures of effectiveness for each action undertaken. Success of a social radar
21 CFR 1271.22 - How and where do I register and submit an HCT/P list?
Code of Federal Regulations, 2011 CFR
2011-04-01
... may submit Form FDA 3356 to the Center for Biologics Evaluation and Research (HFM-775), Food and Drug... 21 Food and Drugs 8 2011-04-01 2011-04-01 false How and where do I register and submit an HCT/P list? 1271.22 Section 1271.22 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND...
21 CFR 1271.22 - How and where do I register and submit an HCT/P list?
Code of Federal Regulations, 2010 CFR
2010-04-01
... may submit Form FDA 3356 to the Center for Biologics Evaluation and Research (HFM-775), Food and Drug... 21 Food and Drugs 8 2010-04-01 2010-04-01 false How and where do I register and submit an HCT/P list? 1271.22 Section 1271.22 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND...
Designing Efficient and Effective, Operationally Relevant, High Altitude Training Profiles
2001-06-01
Operational Medical Issues in Hypo-and Hyperbaric Conditions [les Questions medicales a caractere oprationel liees aux conditions hypobares ou... hyperbares ] To order the complete compilation report, use: ADA395680 The component part is provided here to allow users access to individually authored...Airforce was felt to meet this need and was recommended. Paper presented at the RTO HFM Symposium on "Operational Medical Issues in Hypo- and Hyperbaric
Neuropsychometric Test in Royal Netherlands Navy Mine-Clearance Divers
2001-06-01
Issues in Hypo-and Hyperbaric Conditions [les Questions medicales a caractere oprationel liees aux conditions hypobares ou hyperbares ] To order the...Digit Memory Span Test (F/B DMST-F/B), Learning/memory Paper presented at the RTO HFM Symposium on "Operational Medical Issues in Hypo- and Hyperbaric ... Hyperbaric Medicine Annual Meeting 1995, Florida, USA. Abstract 46: 35. 6. Baker EL, R Letz, A Fidler. A computer-administered Neurobehavioural Evaluation
A Vision for Future Virtual Training
2006-06-15
Future Virtual Training. In Virtual Media for Military Applications (pp. KN2-1 – KN2-12). Meeting Proceedings RTO-MP-HFM-136, Keynote 2. Neuilly-sur...Spin Out. By 2017 , the FCS program will meet Full Operation Capability (FOC). The force structure of the Army at this time will include two BCTs...training environment, allowing them to meet preparatory training proficiency objectives virtually while minimizing the use of costly live ammunition. In
2017-12-01
Development of Persistent Pain and psychological Morbidity after Motor Vehicle Collision: Integrating the Potential Role of Stress Response Systems... abnormalities of quantitative EEG which suggest a THE LONG-TERM COSTS OF TRAUMATIC STRESS: INTERTWINED PHYSICAL AND PSYCHOLOGICAL CONSEQUENCES STO-TR-HFM...S.A., Clauw, D.J., Abelson, J.L. et al., The development of persistent pain and psychological morbidity after motor vehicle collision: integrating
Modern Initial Management of Severe Limbs Trauma in War Surgery: Orthopaedic Damage Control
2010-04-01
avoid fat embolism , allow an optimal nursing and medical evacuation without any secondary functional consequences [3]. 2.2.1 Indications: The...decrease the risk of fat embolism . Modern Initial Management of Severe Limbs Trauma in War Surgery: “Orthopaedic Damage Control” RTO-MP-HFM-182 17...injuries. Orthopaedic Imperious: Multiple open shaft fractures with blood loss, complex epiphysal fractures requiring a long difficult surgical bloody
Pathogen Inactivated Plasma Concentrated: Preparation and Uses
2004-09-01
REPORT DATE 01 SEP 2004 2 . REPORT TYPE N/A 3. DATES COVERED - 4. TITLE AND SUBTITLE Pathogen Inactivated Plasma Concentrated: Preparation...Concentrated: Preparation and Uses 22 - 2 RTO-MP-HFM-109 Results: Both UVC and ozone yielded a PPV logarithmic reduction factor (LRF) of 6, for a...technology to be marketed; the industry name is Plas+SD [ 2 ]. This process functions by attacking the lipid sheathes that surround enveloped viruses
Neural and Biological Soldier Enhancement: From SciFi to Deployment
2009-10-01
and force, extra- and ultra-sensory perception , side-effect free 72-hours unbowed alertness, or brain-based Report Documentation Page Form...Deployment 33 - 2 RTO-MP-HFM-181 augmented reality perception , become conceivable and increasingly within reach. A lot of these extraordinary...visionary or exotic, might severely impact NATO forces´ future performance. In addition, a shift in society´s perception of the parting rule between human
2005-04-01
Job, asked if the time constant of tinnitus persistence after an acute acoustic exposure can be predicted from data on hearing thresholds and/or from...individuals whose tinnitus subsides before 72 hours, post-exposure, and individuals whose tinnitus persists Technical Evaluation Report T - 8 RTO-MP-HFM-123...induces formation of vasoactive lipid peroxidation products in the cochlea. Brain Research, 878, 163-173. [10] Rao, D. B, Moore, D. R., Reinke, L. A
Luo, Gang; Angelidaki, Irini
2013-04-01
Bubbleless gas transfer through a hollow fiber membrane (HFM) module was used to supply H2 to an anaerobic reactor for in situ biogas upgrading, creating a novel system that could achieve a CH4 content higher than 90% in the biogas. The increase of CH4 content and pH, and the decrease of bicarbonate concentration, were related to the increase of the H2 flow rate. The CH4 content increased from 78.4% to 90.2% with the increase of the H2 flow rate from 930 to 1,440 ml/(l day), while the pH in the reactor remained below 8.0. An even higher CH4 content (96.1%) was achieved when the H2 flow rate was increased to 1,760 ml/(l day); however, the pH increased to around 8.3 due to bicarbonate consumption, which hampered the anaerobic process. The biofilm formed on the HFM was found not to be beneficial for the process since it increased the resistance of H2 diffusion to the liquid. The study also demonstrated that the biofilm formed on the membrane only contributed 22-36% to the H2 consumption, while most of the H2 was consumed by the microorganisms in the liquid phase.
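Whether the H2 is consumed directly by hydrogenotrophic methanogens or indirectly via homoacetogenesis, the net upgrading stoichiometry is 4 H2 + CO2 -> CH4 + 2 H2O. A back-of-the-envelope sketch of the H2 demand needed to reach a target CH4 content, assuming a 60/40 CH4/CO2 raw biogas (an illustrative composition, not the digester gas of this study):

def h2_demand(ch4_in=0.60, co2_in=0.40, ch4_target=0.90):
    """Volumes of H2 per unit of raw biogas to reach a target CH4 fraction.
    Assumes 4 H2 + CO2 -> CH4 + 2 H2O, complete H2 consumption, water condensed out.
    """
    # Converting a fraction x of the CO2 swaps it 1:1 for CH4, so the total dry gas is unchanged.
    x = (ch4_target - ch4_in) / co2_in
    if not 0 <= x <= 1:
        raise ValueError("target not reachable from this inlet composition")
    return 4 * co2_in * x

print(h2_demand())   # about 1.2 volumes of H2 per volume of raw biogas under these assumptions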
The Potential Role of Recombinant Activated Factor VIIa (rFVIIa) in Military Pre-Hospital Setting
2004-09-01
coagulation factors and platelets by crystalloids, colloids, or blood products The severity of dilutional coagulopathy is determined by both volume and...RTO-MP-HFM-109 3 - 1 The Potential Role of Recombinant Activated Factor VIIa (rFVIIa) in Military Pre-Hospital Setting LTC (ret.) Uri...decrease mortality from exsanguinations. Recombinant factor VIIa (rFVIIa) has been shown to overcome a variety of coagulation and platelet disorders
Managing Fatigue in Long Duration Airlift Operations 1994
2001-03-01
Air Force Research Laboratory Brooks Air Force Base, Texas, USA Abstract During September, 1994 the operational tempo for US Air Force C-5 transport...Nandina Terrace, Winter Springs, Fl 32708, USA Paper presented at the RTO HFM Workshop on "The Effect of Prolonged Military Activities in Man...drink. They were drinking lots of coffee and beer and most were eating a gastronomically difficult creation called a jumbo burger. We were given 24 hours
2017-03-01
and head pain is one of the top reasons for using CAM. One practical way to organize components of integrative health is into psychological ...chronic pain and the integrative strategy, the impact of yoga was shown on performance, physiology, psychology and spirituality. Yoga shows positive...REPORT TR-HFM-195 Integrative Medicine Interventions for Military Personnel (Interventions médicales intégrantes à destination du personnel militaire
Helicopter Aircrew Training Using Fused Reality
2006-06-01
PROCESS Blue screening involving human filming usually employs a blue or green backdrop, since skin contains little blue or green hue. These backdrops...Helicopter Aircrew Training Using Fused Reality 27 - 10 RTO-MP-HFM-136 a. b. c. d. e. f. Figure 13: Frames Showing Physical Object ( witch ... filming . However, when a user’s hands disrupt the light from a helmet-mounted light source, the shadows cast onto the distant background are diffuse and
2011-04-01
Military Science (RTO-MP-HFM-207) Executive Summary Blast injury is a significant source of casualties in current NATO operations. The term “blast...blast toxicology including dose mechanisms (for example, shock-tube exposure standards), the description of dose endpoints
2004-09-01
RTO-MP-HFM-109 41 - 1 Experience and Consequences on the Deployments of the Medical Services of the German Army in Foreign Countries...army medical services lead to new experiences concerning personnel, training, preparation, support, equipment and standardisation. The consequences...are not only important for the surgical work but also for anaesthesiology, intensive care, internal medicine and neurology and psychiatry. The
Khosravi, Sanaz; Rahimnejad, Samad; Herault, Mikaël; Fournier, Vincent; Lee, Cho-Rong; Dio Bui, Hien Thi; Jeong, Jun-Bum; Lee, Kyeong-Jun
2015-08-01
This study was conducted to evaluate the supplemental effects of three different types of protein hydrolysates in a low fish meal (FM) diet on growth performance, feed utilization, intestinal morphology, innate immunity and disease resistance of juvenile red sea bream. A FM-based diet was used as a high fish meal diet (HFM) and a low fish meal (LFM) diet was prepared by replacing 50% of FM by soy protein concentrate. Three other diets were prepared by supplementing shrimp, tilapia or krill hydrolysate to the LFM diet (designated as SH, TH and KH, respectively). Triplicate groups of fish (4.9 ± 0.1 g) were fed one of the test diets to apparent satiation twice daily for 13 weeks and then challenged by Edwardsiella tarda. At the end of the feeding trial, significantly (P < 0.05) higher growth performance was obtained in fish fed the HFM diet and the hydrolysate-treated diets compared to those fed the LFM diet. Significant improvements in feed conversion and protein efficiency ratios were obtained in fish fed the hydrolysates compared to those fed the LFM diet. Significant enhancement in digestibility of protein was found in fish fed SH and KH diets and dry matter digestibility was increased in the group fed the SH diet in comparison to the LFM group. Fish fed the LFM diet showed significantly higher glucose levels than all the other treatments. Whole-body and dorsal muscle compositions were not significantly influenced by dietary treatments. Histological analysis revealed significant reductions in goblet cell numbers and enterocyte length in the proximal intestine of fish fed the LFM diet. Superoxide dismutase activity and total immunoglobulin level were significantly increased in fish fed the diets containing protein hydrolysates compared to the LFM group. Also, significantly higher lysozyme and antiprotease activities were found in fish fed the hydrolysates and HFM diets compared to those offered the LFM diet. Fish fed the LFM diet exhibited the lowest disease resistance against E. tarda and dietary inclusion of the hydrolysates resulted in significant enhancement of survival rate. The results of the current study indicated that the inclusion of the tested protein hydrolysates, particularly SH, in a LFM diet can improve growth performance, feed utilization, digestibility, innate immunity and disease resistance of juvenile red sea bream. Copyright © 2015 Elsevier Ltd. All rights reserved.
Isomer Energy Source for Space Propulsion Systems
2004-03-01
1,590 Engine F/W (no shield) 3.4 5.0 20.0 A similar core design replacing the fission fuel with the isomer 178Hfm2 is the starting point for this...particles interact and collide with other atoms in the fuel material, reactor core , or coolant, their energy can be transferred to thermal energy...thrust (44). The program produced several reactors that made it all the way through the testing stages of development . The reactors used uranium-235
Good Practices of End of Deployment Debriefing in the Royal Netherlands Navy
2006-04-01
Stress Disorders ( PTSD ) among the victims of accidents or traumatic events. At the international level, it was recommended that the term ‘debriefing...single session debriefing in the civilian sector does not lead to a decline in the incidence of Post Traumatic Stress Meijer, M.; de Vries, R. (2006...46 - 2 RTO-MP-HFM-134 Disorders ( PTSD ) among the victims of accidents or traumatic events. At the international level, it is even recommended
2008-10-01
INTRODUCTION RTO-TR-HFM-118 1 - 7 1.2.1.2 Acoustic HUD A three-dimensional auditory display presents sound from arbitrary directions spanning a...through the visual channel, use of the auditory channel shortens reaction times and is expected to reduce pilot workload, thus improving the overall...of vestibular stimulation using a rotating chair or suitable disorientation device to provide each student with a personal experience of some of
Measuring Morale within the French Army
2006-04-01
Measuring Morale within the French Army 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) 5d. PROJECT NUMBER 5e. TASK...RTO-MP-HFM-134 29 - 1 Measuring Morale within the French Army Commandant Jean Michel FORET EMAT/Centre de Relations Humaines 14 rue Saint...Dominique 00453 Armées FRANCE crh.emat@emat.terre.defense.gouv.fr ABSTRACT The evaluation of the operational capabilitity of the Army passes by
2011-04-01
Third Location Decompression after Operational Deployment 11 - 2 RTO-MP-HFM-205 programs is based upon the literature on combat motivation ...exposure to normal leisure activities and tourism . Massage is another interesting element in the French program. Each soldier receives at least one... gastronomy ; during the French TLD, soldiers were allowed to drink wine or beer with their meal starting at 7pm and bars closed at 1am ultimately. Alcohol
Trauma Induced Pain and Wound Management in Emergency Environment by Low Energy Photonic Therapy
2004-09-01
Technology and Emergency Medical Procedures”, held in St. Pete Beach, USA, 16-18 August 2004, and published in RTO-MP-HFM-109. Report Documentation Page...Casualty Care in Ground-Based Tactical Situations: Trauma Technology and Emergency Medical Procedures (Soins aux blessés au combat dans des situations...tactiques : technologies des traumas et procédures médicales durgence)., The original document contains color images. 14. ABSTRACT 15. SUBJECT TERMS 16
2004-09-01
arterial pressure of 40 mmHg, and then held there until experimental intervention, resuscitation, decompensation or death occurred. The experiment was...regulations relating to animals and experiments involving animals and adheres to principles stated in the Guide for the Care and Use of Laboratory Animals...Rheoencephalography (REG) as a Non-Invasive Monitoring Alternative for the Assessment of Brain Blood Flow P3 - 12 RTO-MP-HFM-109 experiments we
Human-Autonomy Teaming in a Flight Following Task
NASA Technical Reports Server (NTRS)
Shively, Robert J.
2017-01-01
The NATO HFM-247 Working Group is creating a summary report of the group's activities on human-autonomy teaming. This chapter is a summary of our work at NASA Ames toward developing a framework for human-autonomy teaming (HAT) in aviation. The purpose of this project was to demonstrate and evaluate proposed tenets of HAT. The HAT features were derived from three tenets and were built into an automated recommender system on a ground station. These tenets include bi-directional communication, automation transparency, and an operator-directed interface. This study focused primarily on interactions with one piece of automation, the Autonomous Constrained Flight Planner (ACFP). The ACFP is designed to support rapid diversion decisions for commercial pilots in off-nominal situations. Much effort has gone into enhancing this tool not only in capability but also in transparency. In this study, participants used the ACFP at a ground station designed to aid dispatchers in a flight following role to reroute aircraft in situations such as inclement weather, system failures and medical emergencies. Participants performed this task both with and without HAT features enabled, and provided feedback. We examined subjective and behavioral indicators of HAT collaborations using a proof-of-concept demonstration of HAT tenets. The data collected suggest potential advantages and disadvantages of HAT.
Human-Autonomy Teaming: Supporting Dynamically Adjustable Collaboration
NASA Technical Reports Server (NTRS)
Shively, Jay
2017-01-01
This presentation is a technical update for the NATO-STO HFM-247 working group. Our progress on four goals will be discussed. For Goal 1, a conceptual model of HAT is presented. HAT looks to make automation act as more of a teammate, by having it communicate with human operators in a more human, goal-directed, manner which provides transparency into the reasoning behind automated recommendations and actions. This, in turn, permits more trust in the automation when it is appropriate, and less when it is not, allowing a more targeted supervision of automated functions. For Goal 2, we wanted to test these concepts and principles. We present findings from a recent simulation and describe two in progress. Goal 3 was to develop pattern(s) of HAT solution(s). These were originally presented at HCII 2016 and are reviewed. Goal 4 is to develop a re-usable HAT software agent. This is an ongoing effort to be delivered October 2017.
Practitioner-provider joint ventures, the OIG, and the IRS.
MacKelvie, C F; Handler, M S; Sanborn, A B
1992-11-01
Increased forays by hospitals and physicians into joint ventures make both parties subject to a complex network of laws and regulations and to scrutiny by many Federal agencies. Institutional healthcare providers should be aware of the legal pitfalls and exercise extreme caution when creating such arrangements. This is the second of a two-part examination of issues relative to taxes and Medicare and Medicaid fraud and abuse regulations. (Part I was published in the October 1992 issue of HFM.)
French Army - Family Psychological and Social Support
2006-04-01
Major de l’armée de Terre, Bureau condition du personnel 14 rue Saint-Dominique 00453 Armées FRANCE The French Army is permanently engaged on...ELEMENT NUMBER 6. AUTHOR(S) 5d. PROJECT NUMBER 5e. TASK NUMBER 5f. WORK UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) French Army...RTO-MP-HFM-134 21E - 1 French Army – Family Psychological and Social Support LCL G. BOUILLAUD French Army Staff, Quality Of Life Etat
Crispin: capital requirements and reinsurance protect against insolvency.
Crispin, C
2001-12-01
Charles Crispin is president of Evergreen Re, a managed care consulting firm with expertise in the reinsurance industry. Before joining Evergreen Re, Crispin served as a consultant to the managed care industry. He is a member of the American Association of Integrated Delivery Systems, Glen Allen, Virginia, and the Provider Excess Loss Association, Princeton, New Jersey. Crispin recently talked with HFM about risk-based capital requirements for health plans and the impact these solvency guidelines could have on healthcare providers.
1990-06-01
Contamination Marking Set This set is designed for marking areas contaminated with nuclear, biological and chemical (NBC) agents. It consists of a metal ...program, formerly the Heavy Forces Modernization (HFM) program. In March 1990, the Army Systems Acquisition Review Council (ASARC) reviewed the ASM...AUTOKO) prior to initial fielding of MSE in Europe this year, interoperability training was conducted at Grafenwoehr , Germany, from 30 January to 1
Physical Performance Decrements in Military Personnel Wearing Personal Protective Equipment (PPE)
2009-10-01
remaining out of the testing position for more than 5 seconds. For the NeuroCom SOT, subjects were asked to stand on the force plate with the...PPE) P2 - 2 RTO-MP-HFM-181 and without a PPE system of Kevlar® front and back plates and an unlined combat helmet. The average mass of the PPE...in four different postural conditions immediately before and after the treadmill (exercise) test. All participants stood on a Bertec force platform
Hydrogen bonds in concreto and in computro: the sequel
NASA Astrophysics Data System (ADS)
Stouten, Pieter F. W.; Van Eijck, Bouke P.; Kroon, Jan
1991-02-01
In the framework of our comparative research concerning hydrogen bonding in the crystalline and liquid phases we have carried out molecular dynamics (MD) simulations of liquid methanol. Six different rigid three site models are compared. Five of them had been reported in the literature and one (OM2) we developed by a fit to the experimental molar volume, heat of vaporization and neutron weighted radial distribution function. In general the agreement with experiment is satisfactory for the different models. None of the models has an explicit hydrogen bond potential, but five of the six models show a degree of hydrogen bonding comparable to experiments on liquid methanol. The analysis of the simulation hydrogen bonds indicates that there is a distinct preference of the O⋯O axis to lie in the acceptor lone pairs plane, but hardly any for the lone pair directions. Ab initio calculations and crystal structure statistics of OH⋯O hydrogen bonds agree with this observation. The O⋯O hydrogen bond length distributions are similar for most models. The crystal structures show a sharper O⋯O distribution. Explicit introduction of harmonic motion with a quite realistic root mean square amplitude of 0.08 Å to the thermally averaged crystal distribution results in a distribution comparable to OM2 although the maximum of the former is found at shorter distance. On the basis of the analysis of the static properties of all models we conclude that our OM2, Jorgensen's OPLS and Haughney, Ferrario and McDonald's HFM1 models are good candidates for simulations of liquid methanol under isothermal, isochoric conditions. Partly flexible and completely rigid OM2 are simulated at constant pressure and with fixed volume. The flexible simulations give essentially the same (correct) results under both conditions, which is not surprising because the flexible form was fitted under both conditions. Rigid OM2 has a similar potential energy but larger pressure in the isochoric case and larger energy and far larger volume in the isobaric case. Radial distribution functions and hydrogen bond geometries are very similar for all four cases. Only in the case of the isobaric rigid methanol does the volume expansion seem to be accompanied by a slight preference for tetrahedrality around the oxygen atom.
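Hydrogen-bond statistics of the kind analysed here are usually extracted from MD frames with a geometric criterion: an O⋯O distance cutoff combined with an angular cutoff. The cutoffs in the sketch below (3.5 Å and 30 degrees) are commonly assumed values, not necessarily those used in this work.

import numpy as np

def is_hbond(o_donor, h_donor, o_acceptor, r_cut=3.5, angle_cut=30.0):
    """Geometric O-H...O hydrogen-bond test on one MD frame (coordinates in Angstrom).
    Criterion: O...O distance < r_cut and O-H...O deviation from linearity < angle_cut degrees.
    """
    oo = o_acceptor - o_donor
    oh = h_donor - o_donor
    r_oo = np.linalg.norm(oo)
    cosang = np.dot(oh, oo) / (np.linalg.norm(oh) * r_oo)   # angle between the O-H bond and the O...O axis
    angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return r_oo < r_cut and angle < angle_cut

# Hypothetical coordinates of a methanol-like donor/acceptor pair
print(is_hbond(np.array([0.0, 0.0, 0.0]),
               np.array([0.96, 0.0, 0.0]),
               np.array([2.8, 0.0, 0.0])))   # True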
Studies in the development of a bridging device for guiding regenerating axons
NASA Astrophysics Data System (ADS)
Wen, Xuejun
At present there is no clinically effective treatment for injuries or pathological processes that disrupt the continuity of axons in the mature central nervous system. However, a number of studies suggest that a tremendous potential exists for developing therapies. In particular, biomaterials in the form of bridging substrates have been shown to support at least some level of axonal regeneration across the lesion site, but display a limited capacity for directing axons toward their targets. To influence the directionality of the regeneration process, filaments and tubes appear promising but the technology is far from optimized. As a step toward optimization, we investigated various components of a tissue-engineered bridging device consisting of numerous filaments surrounded by a semipermeable biodegradable hollow fiber membrane (HFM). In the first part of the thesis, we studied the influence of filament diameter and various extracellular matrix coatings on neuron regeneration using a dorsal root ganglion explant model. We found that laminin surface treated filaments that approached the size of spinal axons support significantly longer regenerative outgrowth than similarly treated filaments of larger diameter, and exceed outgrowth distance on similarly sized filaments treated with fibronectin. Such substrates also consistently supported the attachment and alignment of glial cells and directed the outgrowth of regenerating axons along the long axis of the filaments. In the last part of the thesis, biodegradable hollow fiber membranes were fabricated and their physical, chemical and degradation properties were analyzed. We found that it is possible to use phase inversion methods to fabricate hollow fiber membranes of widely varying properties that degrade over the course of several months. We then evaluated the biocompatibility of the new materials after implantation in the CNS using an adult rat model. We found that the implants were well tolerated and elicited a reaction that was similar to nondegradable control implants. In addition, following the disappearance of the implant, a stable space was created at the site of implantation which may find use in other reparative strategies such as engineering a linear tract of oriented nerve cells. Taken together, our studies support animal experiments to examine whether a tissue-engineered bridging device consisting of numerous filaments surrounded by a semipermeable biodegradable hollow fiber membrane (HFM) supports directed regeneration following spinal injury.
2009-10-01
Endurance Performance at 4300 m 7 - 6 RTO-MP-HFM-181 breakfast volunteers were provided with two commercially available energy bars and fruit juice ...food composition = 510 kcal, 14 gm fat, 65 gm carbohydrate, 32 gm protein) at 1 to 2 hrs prior to the beginning of each of the cycle endurance test...JE, Robinson SR, Skrinar GS, Lewis SF and Sawka MN. Intermittent altitude exposures improve muscular performance at 4,300 m. J Appl Physiol 95
Lam, Carolyn S P; Gamble, Greg D; Ling, Lieng H; Sim, David; Leong, Kui Toh Gerard; Yeo, Poh Shuan Daniel; Ong, Hean Yee; Jaufeerally, Fazlur; Ng, Tze P; Cameron, Vicky A; Poppe, Katrina; Lund, Mayanna; Devlin, Gerry; Troughton, Richard; Richards, A Mark; Doughty, Robert N
2018-05-21
Whether prevalence and mortality of patients with heart failure with preserved or mid-range (40-49%) ejection fraction (HFpEF and HFmrEF) are similar to those of heart failure with reduced ejection fraction (HFrEF), as reported in some epidemiologic studies, remains highly controversial. We determined and compared characteristics and outcomes for patients with HFpEF, HFmrEF, and HFrEF in a prospective, international, multi-ethnic population. Prospective multi-centre longitudinal study in New Zealand (NZ) and Singapore. Patients with HF were assessed at baseline and followed over 2 years. The primary outcome was death from any cause. The secondary outcome was death and HF hospitalization. Cox proportional hazards models were used to compare outcomes for patients with HFpEF, HFmrEF, and HFrEF. Of 2039 patients enrolled, 28% had HFpEF, 13% HFmrEF, and 59% HFrEF. Compared with HFrEF, patients with HFpEF were older (62 vs. 72 years), more commonly female (17% vs. 48%), and more likely to have a history of hypertension (61% vs. 78%) but less likely to have coronary artery disease (55% vs. 41%). During 2 years of follow-up, 343 (17%) patients died. Adjusting for age, sex, and clinical risk factors, patients with HFpEF had a lower risk of death compared with those with HFrEF (hazard ratio 0.62, 95% confidence interval 0.46-0.85). Plasma NT-proBNP was similarly related to mortality in HFpEF, HFmrEF, and HFrEF, independent of the co-variates listed and of ejection fraction. Results were similar for the composite endpoint of death or HF hospitalization and were consistent between Singapore and NZ. These prospective multinational data showed that the prevalence of HFpEF within the HF population was lower than that of HFrEF. The death rate was comparable in HFpEF and HFmrEF and lower than in HFrEF. Plasma levels of NT-proBNP were independently and similarly predictive of death in the three HF phenotypes. Australian New Zealand Clinical Trial Registry (ACTRN12610000374066).
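The adjusted hazard ratios reported here come from Cox proportional hazards models. A minimal sketch of that kind of fit using the Python lifelines package, with a hypothetical toy dataset and column names (not the study's data or code):

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis set: follow-up in years, death indicator,
# HF phenotype dummies (HFrEF as the reference) and adjustment covariates.
df = pd.DataFrame({
    "time_years": [2.0, 1.3, 0.7, 1.5, 1.8, 0.4, 2.0, 0.9, 1.1, 2.0],
    "died":       [0,   1,   1,   1,   0,   1,   0,   0,   1,   0],
    "hfpef":      [1,   0,   0,   1,   0,   0,   1,   0,   1,   0],
    "hfmref":     [0,   1,   0,   0,   1,   0,   0,   1,   0,   0],
    "age":        [72,  61,  66,  75,  58,  70,  68,  73,  77,  63],
    "female":     [1,   0,   0,   1,   0,   1,   1,   0,   1,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="died")
cph.print_summary()   # exp(coef) for 'hfpef' is the adjusted hazard ratio versus HFrEF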
Malkin, Alexander D; Ye, Sang-Ho; Lee, Evan J; Yang, Xiguang; Zhu, Yang; Gamble, Lara J; Federspiel, William J; Wagner, William R
2018-02-09
Respiratory assist devices, which utilize ∼2 m2 of hollow fiber membranes (HFMs) to achieve desired gas transfer rates, have been limited in their adoption by blood biocompatibility limitations. This study reports two techniques for the functionalization and subsequent conjugation of zwitterionic sulfobetaine (SB) block copolymers to polymethylpentene (PMP) HFM surfaces with the intention of reducing thrombus formation in respiratory assist devices. Amine or hydroxyl functionalization of PMP HFMs (PMP-A or PMP-H) was accomplished using plasma-enhanced chemical vapor deposition. The generated functional groups were conjugated to low molecular weight SB block copolymers with N-hydroxysuccinimide ester or siloxane groups (SBNHS or SBNHSi) that were synthesized using reversible addition fragmentation chain transfer polymerization. The modified HFMs (PMP-A-SBNHS or PMP-H-SBNHSi) showed 80-95% reduction in platelet deposition from whole ovine blood, stability under the fluid shear of anticipated operating conditions, and uninhibited gas exchange performance relative to non-modified HFMs (PMP-C). Additionally, the functionalization and SBNHSi conjugation technique was shown to reduce platelet deposition on polycarbonate and poly(vinyl chloride), two other materials commonly found in extracorporeal circuits. The observed thromboresistance and stability of the SB modified surfaces, without degradation of HFM gas transfer performance, indicate that this approach is promising for longer term pre-clinical testing in respiratory assist devices and may ultimately allow for the reduction of anticoagulation levels in patients being supported for extended periods. © 2018 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 2018.
A novel osteogenic distraction device for the transversal correction of temporozygomatic hypoplasia.
Pagnoni, Mario; Fadda, Maria Teresa; Cascone, Piero; Iannetti, Giorgio
2014-07-01
Hemifacial microsomia (HFM) is a congenital disorder characterized by craniofacial malformation of one or both sides of the lower face. Since these anomalies are associated with soft-tissue deficiencies, corrective surgery is often difficult. Bone grafts have typically been used for augmentation, and distraction osteogenesis now offers an alternative for many craniofacial deficiencies, but there are few if any appropriate distraction devices and surgical procedures for the augmentation of craniofacial transversal dimensions. The aim of this study was to evaluate a technique for guided augmentation of craniofacial transversal dimensions through distraction osteogenesis. We tested the efficacy of a prototype distractor, developed in collaboration with Medartis, using cadavers and demonstrated its application for the correction of the transverse dimension of the temporozygomatic region in a patient with Goldenhar syndrome. CT scans showed a 4-mm transverse augmentation of the bony surface after 9 days and a 10-mm increase after 30 days. Upon removal of the distractor (60 days after the first surgery), CT indicated good bony fusion and a stable result in the transverse plane. Six months after removal of the distractor, 3D computed tomography confirmed the success of the transverse augmentation, as it appeared to be stable and reliable. Distraction osteogenesis, using our device, can be used to correct the transverse dimension of the temporozygomatic region in HFM patients. It should also be considered for the correction of residual postsurgical skeletal deficiency due to surgical relapse or deficient growth, and unsatisfactory skeletal contour. Copyright © 2013 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bridge, Michael John
Hollow fiber membrane (HFM) cell encapsulation devices use a semipermeable membrane to physically immunoisolate transplanted secretory cells from host tissues and high molecular weight solutes. Advantages inherent to macroencapsulation technology have led to extensive research towards their utilization for treating a wide range of disorders including a number of neurodegenerative diseases and diabetes. Although feasibility studies have already established the therapeutic potential of macroencapsulation technology, a common observation among these and later studies is diminishing therapeutic efficacy over a span of a few weeks following implantation of devices. Progress towards fulfilling the therapeutic potential of this technology initially recognized by investigators has potentially been hampered by inadequate diffusive transport characterization of membranes employed in studies. In addition, the potential effects of host tissue responses following central nervous system (CNS) implantation of these devices are completely unknown. To address these issues a membrane characterization instrument capable of efficiently characterizing the diffusive and convective transport properties of individual HFM segments, such as they are used in devices, was developed. The instrument was then employed to study the effects of ethanol exposure, a common sterilization method, on PAN-PVC membranes commonly used in studies of CNS-implanted macroencapsulation devices. Lastly, the solute diffusivity properties of tissue that forms adjacent to the membranes of brain-implanted transcranial access devices were investigated. Coinciding with this investigation was the development of a novel technique for examining the solute diffusivity properties in the extracellular spaces of CNS tissue.
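Diffusive transport of an HFM segment is commonly reduced to a solute permeability estimated from the decay of the concentration difference between two well-mixed chambers separated by the membrane. The sketch below implements that generic two-compartment reduction with assumed numbers; it is not the instrument or protocol described in this work.

import math

def permeability(dC0, dCt, area_cm2, t_s, v1_ml, v2_ml):
    """Solute permeability (cm/s) from the decay of the concentration difference
    between two well-mixed chambers separated by the membrane.
    Model: d(dC)/dt = -P*A*(1/V1 + 1/V2)*dC  =>  P = ln(dC0/dCt) / (A*t*(1/V1 + 1/V2)).
    """
    return math.log(dC0 / dCt) / (area_cm2 * t_s * (1.0 / v1_ml + 1.0 / v2_ml))

# Hypothetical numbers: the difference halves in 30 min across 0.5 cm^2 of fiber wall
print(permeability(dC0=1.0, dCt=0.5, area_cm2=0.5, t_s=1800, v1_ml=1.0, v2_ml=1.0))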
2013-12-01
of the performance of their troops; b) significant scientific/capability gaps exist with regard to susceptibility, the... The aim of HFM/RTG-187 was to: a) compare and evaluate the technical resources currently in use for a “management system for ...of an integrated doctrine and training programmes for thermal risk management in order to guarantee the maintenance of
Microfabrication of hybrid fluid membrane for microengines
NASA Astrophysics Data System (ADS)
Chutani, R.; Formosa, F.; de Labachelerie, M.; Badel, A.; Lanzetta, F.
2015-12-01
This paper describes the microfabrication and dynamic characterization of thick membranes providing a technological solution for microengines. The studied membranes are called hybrid fluid-membrane (HFM) and consist of two thin membranes that encapsulate an incompressible fluid. This work details the microelectromechanical system (MEMS) scalable fabrication and characterization of HFMs. The membranes are composite structures based on Silicon spiral springs embedded in a polymer (RTV silicone). The anodic bonding of multiple stacks of Si/glass structures, the fluid filling and the sealing have been demonstrated. Various HFMs were successfully fabricated and their dynamic characterization demonstrates the agreement between experimental and theoretical results.
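The dynamic behaviour of such a spring-suspended membrane is often summarized by its undamped natural frequency, f0 = (1/(2*pi)) * sqrt(k/m). A minimal arithmetic sketch with assumed stiffness and moving mass (not the values of the fabricated HFMs):

import math

def natural_frequency_hz(stiffness_n_per_m, moving_mass_kg):
    """Undamped natural frequency of a spring-suspended membrane, f0 = sqrt(k/m)/(2*pi)."""
    return math.sqrt(stiffness_n_per_m / moving_mass_kg) / (2.0 * math.pi)

# Assumed illustrative values: 400 N/m effective spring stiffness, 10 mg moving mass
print(natural_frequency_hz(400.0, 10e-6))   # roughly 1 kHz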
2007-04-01
1 Chapter 1 – Introduction 1 - 1 1.1 Background and Problem Definition 1 - 1 1.1.1...Background 1 - 1 1.1.2 Problem Definition 1 -2 1.2 The Objective and Approach of the HFM-090/TG-25 1 -2 1.2.1 Objective 1 -2 1.2.2 Approach 1 -2 1.3...Organization of this Report 1 -3 1.4 References 1 -3 Chapter 2 – The Mine Detonation Process and Occupant Loading 2- 1 2.1 Introduction to Mines 2- 1 2.2
Tanaka, Yoshiyuki; Mizoe, Genki; Kawaguchi, Tomohiro
2015-01-01
This paper proposes a simple diagnostic methodology for checking the ability of proprioceptive/kinesthetic sensation by using a robotic device. The perception ability of virtual frictional forces is examined in operations of the robotic device by the hand at a uniform slow velocity along a virtual straight/circular path. Experimental results from healthy subjects demonstrate that the percentage of correct answers for the designed perceptual tests changes with the motion direction as well as with the arm configuration and the HFM (human force manipulability) measure. This suggests that the proposed methodology can be applied to the early detection of neuromuscular/neurological disorders.
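The HFM measure referenced here builds on the force manipulability ellipsoid: with end-point force f = J^(-T) tau and bounded joint-torque effort, the reachable forces form an ellipsoid whose semi-axes are the reciprocals of the Jacobian's singular values. The sketch below computes that ellipsoid for a generic planar two-link arm with hypothetical link lengths and posture; it is an unweighted illustration, not the isometric-torque-weighted HFM measure itself.

import numpy as np

def planar_2link_jacobian(q1, q2, l1=0.3, l2=0.3):
    """End-point Jacobian of a planar two-link arm (lengths in metres, angles in radians)."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def force_ellipsoid(J):
    """Force ellipsoid for unit joint-torque effort: semi-axis lengths are 1/sigma_i of J."""
    U, s, _ = np.linalg.svd(J)
    return 1.0 / s, U   # semi-axis lengths (force per unit torque) and their directions (columns of U)

J = planar_2link_jacobian(q1=0.6, q2=1.2)   # hypothetical arm posture
axes, dirs = force_ellipsoid(J)
print(axes)   # the largest value marks the direction in which the largest end-point force is feasible
print(1.0 / np.prod(np.linalg.svd(J, compute_uv=False)))   # scalar force-manipulability measure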
2005-04-01
alain.leger@fr.thalesgroup.com THALES Aerospace Rue Toussaint Catros 33187 Le Haillan FRANCE ABSTRACT The earcup shells of the Topowl helmet have...already been the subject of a study aimed at optimizing their hearing protection within the strict mass and volume budgets allotted. The present...audio techniques (pp. 17-1 – 17-14). Meeting Proceedings RTO-MP-HFM-123, Paper 17. Neuilly-sur-Seine, France: RTO. Available on the site
Objectives and Design of the Hemodialysis Fistula Maturation Study
Dember, Laura M.; Imrey, Peter B.; Beck, Gerald J.; Cheung, Alfred K.; Himmelfarb, Jonathan; Huber, Thomas S.; Kusek, John W.; Roy-Chaudhury, Prabir; Vazquez, Miguel A.; Alpers, Charles E.; Robbin, Michelle L.; Vita, Joseph A.; Greene, Tom; Gassman, Jennifer J.; Feldman, Harold I.
2014-01-01
Background: A large proportion of newly created arteriovenous fistulas cannot be used for dialysis because they fail to mature adequately to support the hemodialysis blood circuit. The Hemodialysis Fistula Maturation (HFM) Study was designed to elucidate clinical and biological factors associated with fistula maturation outcomes. Study Design: Multicenter prospective cohort study. Setting & Participants: Approximately 600 patients undergoing creation of a new hemodialysis fistula will be enrolled at 7 centers in the United States and followed up for as long as 4 years. Predictors: Clinical, anatomical, biological, and process-of-care attributes identified pre-operatively, intra-operatively, or post-operatively. Outcomes: The primary outcome is unassisted clinical maturation defined as successful use of the fistula for dialysis for four weeks without any maturation-enhancing procedures. Secondary outcomes include assisted clinical maturation, ultrasound-based anatomical maturation, fistula procedures, fistula abandonment, and central venous catheter use. Measurements: Pre-operative ultrasound arterial and venous mapping, flow-mediated and nitroglycerin-mediated brachial artery dilation, arterial pulse wave velocity, and venous distensibility; intra-operative vein tissue collection for histopathological and molecular analyses; post-operative ultrasounds at 1 day, 2 weeks, 6 weeks, and prior to fistula intervention and initial cannulation. Results: Assuming complete data, no covariate adjustment, and unassisted clinical maturation of 50%, there will be 80% power to detect ORs of 1.83 and 1.61 for dichotomous predictor variables with exposure prevalences of 20% and 50%, respectively. Limitations: Exclusion of two-stage transposition fistulas limits generalizability. The requirement for study visits may result in a cohort that is healthier than the overall population of patients undergoing fistula creation. Conclusions: The HFM Study will be of sufficient size and scope to 1) evaluate a broad range of mechanistic hypotheses, 2) identify clinical practices associated with maturation outcomes, 3) assess the predictive utility of early indicators of fistula outcome, and 4) establish targets for novel therapeutic interventions to improve fistula maturation. PMID:23992885
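The power statement in the Results can be approximated by mapping each odds ratio and the 50% overall maturation rate onto two group proportions and applying a standard two-proportion power routine. The sketch below does that under those simplifying assumptions; it is not the study's actual design calculation.

from scipy.optimize import brentq
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

def power_for_or(odds_ratio, exposure_prev, overall_rate=0.5, n=600, alpha=0.05):
    """Approximate power to detect an odds ratio for a dichotomous predictor,
    treated as a simple two-proportion comparison (a simplification of the study design)."""
    def overall(p0):
        odds1 = odds_ratio * p0 / (1 - p0)
        p1 = odds1 / (1 + odds1)
        return exposure_prev * p1 + (1 - exposure_prev) * p0 - overall_rate
    p0 = brentq(overall, 1e-6, 1 - 1e-6)          # event rate in the unexposed group
    p1 = (odds_ratio * p0 / (1 - p0)) / (1 + odds_ratio * p0 / (1 - p0))
    n1 = n * exposure_prev
    n0 = n * (1 - exposure_prev)
    es = proportion_effectsize(p1, p0)
    return NormalIndPower().solve_power(effect_size=es, nobs1=n1, alpha=alpha,
                                        ratio=n0 / n1, power=None)

print(power_for_or(1.83, 0.20))   # roughly 0.8 under these simplifying assumptions
print(power_for_or(1.61, 0.50))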
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lasry, Inbal; Berman, Bluma; Glaser, Fabian
2009-08-28
The proton-coupled folate transporter (PCFT/SLC46A1) mediates intestinal folate uptake at acidic pH. Some loss of folic acid (FA) transport mutations in PCFT from hereditary folate malabsorption (HFM) patients cluster in R113, thereby suggesting a functional role for this residue. Herein, unlike non-conservative substitutions, an R113H mutant displayed an 80-fold increase in the FA transport Km while retaining parental Vmax, hence indicating a major fall in folate substrate affinity. Furthermore, consistent with the preservation of 9% of parental transport activity, R113H transfectants displayed a substantial decrease in the FA growth requirement relative to mock transfectants. Homology modeling based on the crystal structures of the Escherichia coli transporter homologues EmrD and glycerol-3-phosphate transporter revealed that the R113H rotamer properly protrudes into the cytoplasmic face of the minor cleft normally occupied by R113. These findings constitute the first demonstration that a basic amino acid at position 113 is required for folate substrate binding.
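With Vmax preserved and Km increased roughly 80-fold, the residual transport activity at a given folate concentration follows directly from Michaelis-Menten kinetics. A small illustration with assumed parameter values (the actual Km values and assay concentrations are not reproduced here):

def mm_rate(s, vmax, km):
    """Michaelis-Menten transport rate at substrate concentration s."""
    return vmax * s / (km + s)

# Assumed illustrative values: wild-type Km = 1 uM, R113H Km = 80 uM, identical Vmax.
s = 2.0   # uM folic acid, hypothetical assay concentration
wt = mm_rate(s, vmax=100.0, km=1.0)
mut = mm_rate(s, vmax=100.0, km=80.0)
print(wt, mut, mut / wt)   # the mutant retains only a small fraction of parental activity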
Three-dimensional computational model of a blood oxygenator reconstructed from micro-CT scans.
D'Onofrio, C; van Loon, R; Rolland, S; Johnston, R; North, L; Brown, S; Phillips, R; Sienz, J
2017-09-01
Cardiopulmonary bypass procedures are one of the most common operations and blood oxygenators are the centrepiece of the heart-lung machines. Blood oxygenators have been tested as entire devices but intricate details on the flow field inside the oxygenators remain unknown. In this study, a novel method is presented to analyse the flow field inside oxygenators based on micro Computed Tomography (μCT) scans. Two Hollow Fibre Membrane (HFM) oxygenator prototypes were scanned and three-dimensional full scale models that capture the device-specific fibre distributions are set up for computational fluid dynamics analysis. The blood flow through the oxygenator is modelled as a non-Newtonian fluid. The results were compared against the flow solution through an ideal fibre distribution and show the importance of a uniform distribution of fibres and that the oxygenators analysed are not susceptible to flow directionality, as mass flow versus area remains the same. However the pressure drop across the oxygenator is dependent on flow rate and direction. By comparing residence time of blood against the time frame to fully saturate blood with oxygen, we highlight the potential of this method as a design optimisation tool. In conclusion, image-based reconstruction is found to be a feasible route to assess oxygenator performance through flow modelling. It offers the possibility to review a product as manufactured rather than as designed, which is a valuable insight as a precursor to the approval processes. Finally, the flow analysis presented may be extended, at computational cost, to include species transport in further studies. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
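Treating blood as a non-Newtonian fluid typically means a shear-thinning viscosity law. The Carreau-Yasuda form below, with commonly quoted literature parameters, is included only as an assumed illustration of such a law, not as the specific model used in this study.

import numpy as np

def carreau_yasuda(shear_rate, mu0=0.056, mu_inf=0.00345, lam=3.313, n=0.3568, a=2.0):
    """Shear-thinning blood viscosity (Pa*s) as a function of shear rate (1/s).
    Parameter values are commonly quoted literature fits, used here as assumptions.
    """
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** a) ** ((n - 1.0) / a)

for g in [0.1, 1.0, 10.0, 100.0, 1000.0]:
    print(g, carreau_yasuda(g))   # viscosity falls toward mu_inf as the shear rate rises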
Velliquette, Rodney A; Grann, Kerry; Missler, Stephen R; Patterson, Jennifer; Hu, Chun; Gellenbeck, Kevin W; Scholten, Jeffrey D; Randolph, R Keith
2015-01-01
Diacylglyceride acyltransferase 1 (DGAT1) is the enzyme that adds the final fatty acid on to a diacylglyceride during triglyceride (TG) synthesis. DGAT1 plays a key role in the repackaging of dietary TG into circulating TG rich chylomicrons. A growing amount of research has indicated that an exaggerated postprandial circulating TG level is a risk indicator for cardiovascular and metabolic disorders. The aim of this research was to identify a botanical extract that inhibits intestinal DGAT1 activity and attenuates postprandial hypertriglyceridemia in overweight and obese humans. Twenty individual phytochemicals and an internal proprietary botanical extract library were screened with a primary cell-free DGAT1 enzyme assay that contained dioleoyl glycerol and palmitoleoyl Coenzyme A as substrates plus human intestinal microsomes as the DGAT1 enzyme source. Botanical extracts with IC50 values < 100 μg/mL were evaluated in a cellular DGAT1 assay. The cellular DGAT1 assay comprised the analysis of (14)C labeled TG synthesis in cells incubated with (14)C-glycerol and 0.3 mM oleic acid. Lead botanical extracts were then evaluated in a parallel, double-blind, placebo-controlled clinical trial. Ninety healthy, overweight and obese participants were randomized to receive 2 g daily of placebo or individual botanical extracts (the investigational product) for seven days. Serum TG levels were measured before and after consuming a high fat meal (HFM) challenge (0.354 L drink/shake; 77 g fat, 25 g carbohydrate and 9 g protein) as a marker of intestinal DGAT1 enzyme activity. Phenolic acids (e.g., gallic acid) and polyphenols (e.g., cyanidin) abundantly found in nature appeared to inhibit DGAT1 enzyme activity in vitro. Four polyphenolic rich botanical extracts were identified from in vitro evaluation in both cell-free and cellular model systems: apple peel extract (APE), grape extract (GE), red raspberry leaf extract (RLE) and apricot/nectarine extract (ANE) (IC50 = 1.4, 5.6, 10.4, and 3.4 μg/mL, respectively). In the seven-day clinical trial, compared to placebo, only GE significantly reduced the baseline subtracted change in serum TG AUC following consumption of the HFM (AUC = 281 ± 37 vs. 181 ± 30 mg/dL*h, respectively; P = 0.021). Chromatographic characterization of the GE revealed a large number of closely eluting components containing proanthocyanidins, catechins, anthocyanins and their secondary metabolites that corresponded with the observed DGAT1 enzyme inhibition in the cell-free model. These data suggest that a dietary GE has the potential to attenuate postprandial hypertriglyceridemia in part by the inhibition of intestinal DGAT1 enzyme activity without intolerable side effects. This trial was registered with ClinicalTrials.gov NCT02333461.
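The quoted IC50 values are the midpoints of fitted concentration-inhibition curves. As a sketch only (not the assay's analysis code), a four-parameter logistic fit to hypothetical percent-of-control activity data:

import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Hypothetical DGAT1 activity (% of control) versus extract concentration (ug/mL)
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
activity = np.array([98.0, 95.0, 80.0, 55.0, 30.0, 12.0, 5.0])

popt, _ = curve_fit(four_pl, conc, activity, p0=[0.0, 100.0, 3.0, 1.0], maxfev=10000)
print("fitted IC50 (ug/mL):", popt[2])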
Kortner, Trond M; Penn, Michael H; Björkhem, Ingemar; Måsøval, Kjell; Krogdahl, Åshild
2016-09-07
The present study was undertaken to gain knowledge of the role of bile components and lecithin in the development of aberrations in digestive functions, which have seemingly increased in Atlantic salmon in parallel with the increased use of plant ingredients in fish feed. Post-smolt Atlantic salmon were fed for 77 days one of three basal diets: a high fish meal diet (HFM), a low fishmeal diet (LFM), or a diet with high protein soybean meal (HPS). Five additional diets were made from the LFM diet by supplementing with: purified taurocholate (1.8 %), bovine bile salt (1.8 %), taurine (0.4 %), lecithin (1.5 %), or a mix of supplements (suppl mix) containing taurocholate (1.8 %), cholesterol (1.5 %) and lecithin (0.4 %). Two additional diets were made from the HPS diet by supplementing with: bovine bile salt (1.8 %) or the suppl mix. Body and intestinal weights were recorded, and blood, bile, intestinal tissues and digesta were sampled for evaluation of growth, nutrient metabolism and intestinal structure and function. In comparison with fish fed the HFM diet, fish fed the LFM and HPS diets grew less and showed reduced plasma bile salt and cholesterol levels. Histological examination of the distal intestine showed signs of enteritis in both the LFM and HPS diet groups, though more pronounced in the HPS diet group. The HPS diet reduced digesta dry matter and leucine aminopeptidase capacity in the distal intestine. None of the dietary supplements improved endpoints regarding fish performance, gut function or inflammation in the distal intestine; some endpoints instead indicated negative effects. Dietary supplementation with bile components or lecithin in general did not improve endpoints regarding performance or gut health in Atlantic salmon, in clear contrast to what has been previously reported for rainbow trout. Follow-up studies are needed to clarify whether lower levels of bile salts and cholesterol may give different and beneficial effects, or whether other supplements, or other combinations of supplements, might prevent or ameliorate inflammation in the distal intestine.
NASA Astrophysics Data System (ADS)
Zhang, Ning
A variety of biomaterials have been chronically implanted into the central nervous system (CNS) for repair or therapeutic purposes. Regardless of the application, chronic implantation of materials into the CNS induces injury and elicits a wound healing response, eventually leading to the formation of a dense extracellular matrix (ECM)-rich scar tissue that is associated with the segregation of implanted materials from the surrounding normal tissue. Often this reaction results in impaired performance of indwelling CNS devices. In order to enhance the performance of biomaterial-based implantable devices in the CNS, this thesis investigated whether the adult brain tissue response to implanted biomaterials could be manipulated by changing biomaterial surface properties or, further, by utilizing the biology of co-transplanted cells. Specifically, the adult rat brain tissue response to chronically implanted poly(acrylonitrile-vinylchloride) (PAN-PVC) hollow fiber membranes (HFMs) of varying surface architecture was examined temporally at 2, 4, and 12 weeks postimplantation. Significant differences were discovered in the brain tissue response to the PAN-PVC HFMs of varying surface architecture at 4 and 12 weeks. To extend this work, it was critically examined whether soluble factors derived from a co-transplanted cellular component further affect the brain tissue response to an implanted HFM in a significant way. The cells used were astrocytes, whose ability to influence the scar formation process following CNS injury by physical contact with the host tissue had been documented in the literature. Data indicated for the first time that astrocyte-derived soluble factors ameliorate the adult brain tissue reactivity toward HFM implants in an age-dependent manner. While immature astrocytes secreted soluble factors that suppressed the brain tissue reactivity around the implants, mature astrocytes secreted factors that enhanced the gliotic response. These findings demonstrate the feasibility of ameliorating CNS tissue reactivity toward biomaterial implants by varying biomaterial surface properties or by incorporating scar-reductive factors derived from functional cells into implant constructs, and therefore provide guidance for the design of more integrative biomaterial-based implantable devices for CNS repair.
Salden, Bouke N; Troost, Freddy J; de Groot, Eric; Stevens, Yala R; Garcés-Rimón, Marta; Possemiers, Sam; Winkens, Bjorn; Masclee, Ad A
2016-12-01
Endothelial dysfunction (ED) is involved in the development of atherosclerosis. Hesperidin, a citrus flavonoid with antioxidant and other biological properties, potentially exerts beneficial effects on endothelial function (EF). We investigated the effect of hesperidin 2S supplementation on EF in overweight individuals. This was a randomized, double-blind, placebo-controlled study in which 68 individuals were randomly assigned to receive hesperidin 2S (450 mg/d) or a placebo for 6 wk. At baseline and after 6 wk of intervention, flow-mediated dilation (FMD), soluble vascular adhesion molecule-1 (sVCAM-1), soluble intracellular adhesion molecule-1 (sICAM-1), soluble P-selectin (sP-selectin), systolic blood pressure (SBP), and diastolic blood pressure (DBP) were assessed. Acute, reversible ED was induced by intake of a high-fat meal (HFM). A second FMD scan was performed 2 h postprandially, and adhesion molecules were assessed 2 and 4 h postprandially. An additional exploratory analysis was performed in subjects with baseline FMD ≥3%. No significant change in fasting or postprandial FMD was observed after 6 wk of hesperidin intake compared with placebo intake. However, there was a trend for a reduction of sVCAM-1, sICAM-1, sP-selectin, SBP, and DBP after 6 wk of hesperidin treatment. In the FMD ≥3% group, hesperidin protected individuals from postprandial ED (P = 0.050) and significantly downregulated sVCAM-1 and sICAM-1 (all P ≤ 0.030). The results reported in the current article were not adjusted for multiplicity. Six weeks of consumption of hesperidin 2S did not improve basal or postprandial FMD in our total study population. There was a tendency toward a reduction of adhesion molecules and a decrease in SBP and DBP. Further exploratory analyses revealed that, in subjects with baseline FMD ≥3%, hesperidin 2S improved ED after an HFM and reduced adhesion molecules. These results indicate the cardiovascular health benefits of hesperidin 2S in overweight and obese individuals with a relatively healthy endothelium. This trial was registered at clinicaltrials.gov as NCT02228291. © 2016 American Society for Nutrition.
Madhani, Shalv P; D'Aloiso, Brandon D; Frankowski, Brian; Federspiel, William J
2016-01-01
Hollow fiber membranes (HFMs) are used in blood oxygenators for cardiopulmonary bypass or in next generation artificial lungs. Flow analysis of these devices is typically done using computational fluid dynamics (CFD), modeling HFM bundles as porous media with a Darcy permeability coefficient estimated from the Blake-Kozeny (BK) equation to account for viscous drag from fibers. We recently published how well this approach can predict Darcy permeability for fiber bundles made from polypropylene HFMs, showing the prediction can be significantly improved using an experimentally derived correlation between the BK constant (A) and bundle porosity (ε). In this study, we assessed how well our correlation for A worked for predicting the Darcy permeability of fiber bundles made from Membrana polymethylpentene (PMP) HFMs, which are increasingly being used clinically. Swatches in the porosity range of 0.4 to 0.8 were assessed, in which sheets of fiber were stacked in parallel, perpendicular, and angled configurations. Our previously published correlation predicted Darcy permeability within ±8%. A new correlation based on current and past measured permeability was determined: A = 497ε - 103; using this correlation, measured Darcy permeability was predicted within ±6%. This correlation varied from 8% to -3.5% of our prior correlation over the tested porosity range.
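For readers who want to apply the reported correlation, the sketch below combines A = 497ε - 103 with the Blake-Kozeny-type permeability expression commonly used for fibre bundles, k = d_f²ε³/(A(1−ε)²); that functional form and the 380 μm fibre diameter are assumptions, not quantities stated in the abstract.

# Darcy permeability of a hollow fibre bundle from a Blake-Kozeny-type relation.
# The porosity correlation A = 497*eps - 103 is quoted in the abstract; the
# permeability form and the 380-micron fibre outer diameter are assumptions.
def bk_constant(eps: float) -> float:
    """Blake-Kozeny constant A from the reported porosity correlation."""
    return 497.0 * eps - 103.0

def darcy_permeability(eps: float, fibre_diameter_m: float = 380e-6) -> float:
    """Darcy permeability [m^2] of a fibre bundle with porosity eps."""
    A = bk_constant(eps)
    return fibre_diameter_m ** 2 * eps ** 3 / (A * (1.0 - eps) ** 2)

for eps in (0.4, 0.6, 0.8):
    print(f"eps = {eps:.1f}:  A = {bk_constant(eps):6.1f},  k = {darcy_permeability(eps):.3e} m^2")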
NASA Technical Reports Server (NTRS)
Reynolds, Thomas L.; Eklund, Thor I.; Haack, Gregory A.
2001-01-01
The purpose of this contract study task was to investigate the state of the art in gas separation technologies utilized for separating air into both nitrogen and oxygen gases for potential applications on commercial aircraft. The intended applications included: nitrogen gas for fuel tank inerting, cargo compartment fire protection, and emergency oxygen for passenger and crew use in the event of loss of cabin pressure. The approach was to investigate three principal methods of gas separation: Hollow Fiber Membrane (HFM), Ceramic Membrane (CM), and liquefaction: Total Atmospheric Liquefaction of Oxygen and Nitrogen (TALON). Additional data on the performance of molecular sieve pressure swing adsorption (PSA) systems was also collected and discussed. Performance comparisons of these technologies are contained in the body of the report.
Drug-induced Liver Disease in Patients with Diabetes Mellitus.
Iryna, Klyarytskaya; Helen, Maksymova; Elena, Stilidi
2015-01-01
The study presented here was conducted to assess the course of drug-induced liver disease in patients with rheumatoid arthritis receiving long-term methotrexate therapy. Diabetes mellitus was revealed as the most significant risk factor. The combination of diabetes mellitus with other risk factors (female sex) resulted in increased hepatic fibrosis, a greater degree of hepatic encephalopathy and a reduction of hepatic function. The effectiveness and safety of ursodeoxycholic acid and, for the cytolytic type, S-adenosyl methionine were also evaluated. 13C-MBT: 13C-methacetin breath test; ALT: alanine aminotransferase; AP: alkaline phosphatase; AST: aspartic transaminase; DILD: drug-induced liver disease; DM: diabetes mellitus; HE: hepatic encephalopathy; HFM: hepatic functional mass; SAMe: S-Adenosyl methionine; UDCA: ursodeoxycholic acid. Iryna K, Helen M, Elena S. Drug-induced Liver Disease in Patients with Diabetes Mellitus. Euroasian J Hepato-Gastroenterol 2015;5(2):83-86.
Hydrostructural maps of the Death Valley regional flow system, Nevada and California
Potter, C.J.; Sweetkind, D.S.; Dickerson, R.P.; Killgore, M.L.
2002-01-01
The locations of principal faults and structural zones that may influence ground-water flow were compiled in support of a three-dimensional ground-water model for the Death Valley regional flow system (DVRFS), which covers 80,000 square km in southwestern Nevada and southeastern California. Faults include Neogene extensional and strike-slip faults and pre-Tertiary thrust faults. Emphasis was given to characteristics of faults and deformed zones that may have a high potential for influencing hydraulic conductivity. These include: (1) faulting that results in the juxtaposition of stratigraphic units with contrasting hydrologic properties, which may cause ground-water discharge and other perturbations in the flow system; (2) special physical characteristics of the fault zones, such as brecciation and fracturing, that may cause specific parts of the zone to act either as conduits or as barriers to fluid flow; (3) the presence of a variety of lithologies whose physical and deformational characteristics may serve to impede or enhance flow in fault zones; (4) orientation of a fault with respect to the present-day stress field, possibly influencing hydraulic conductivity along the fault zone; and (5) faults that have been active in late Pleistocene or Holocene time and areas of contemporary seismicity, which may be associated with enhanced permeabilities. The faults shown on maps A and B are largely from Workman and others (in press), and fit one or more of the following criteria: (1) faults that are more than 10 km in map length; (2) faults with more than 500 m of displacement; and (3) faults in sets that define a significant structural fabric that characterizes a particular domain of the DVRFS. The following fault types are shown: Neogene normal, Neogene strike-slip, Neogene low-angle normal, pre-Tertiary thrust, and structural boundaries of Miocene calderas. We have highlighted faults that have late Pleistocene to Holocene displacement (Piety, 1996). Areas of thick Neogene basin-fill deposits (thicknesses 1-2 km, 2-3 km, and >3 km) are shown on map A, based on gravity anomalies and depth-to-basement modeling by Blakely and others (1999). We have interpreted the positions of faults in the subsurface, generally following the interpretations of Blakely and others (1999). Where geophysical constraints are not present, the faults beneath late Tertiary and Quaternary cover have been extended based on geologic reasoning. Nearly all of these concealed faults are shown with continuous solid lines on maps A and B, in order to provide continuous structures for incorporation into the hydrogeologic framework model (HFM). Map A also shows the potentiometric surface, regional springs (25-35 degrees Celsius, D'Agnese and others, 1997), and cold springs (Turner and others, 1996).
Hafnium Isotopic Variations in Central Atlantic Intraplate Volcanism
NASA Astrophysics Data System (ADS)
Geldmacher, J.; Hanan, B. B.; Hoernle, K.; Blichert-Toft, J.
2008-12-01
Although one of the geochemically best investigated volcanic regions on Earth, almost no Hf isotopic data have been published from the broad belt of intraplate seamounts and islands in the East Atlantic between 25° and 36° N. This study presents 176Hf/177Hf ratios from 61 representative samples from the Canary, Selvagen and Madeira Islands and nearby large seamounts, encompassing the full range of different evolutionary stages and geochemical endmembers. The majority of samples have mafic, mainly basaltic compositions with Mg-numbers within or near the range of magmas in equilibrium with mantle olivine (68-75). No correlation was found between Mg-number and 176Hf/177Hf ratios in the data set. In comparison to observed Nd isotope variations published for this volcanic province (6 ɛNd units), 176Hf/177Hf ratios span a larger range (14 ɛHf units). Samples from the Madeira archipelago have the most radiogenic compositions (176Hf/177Hfm= 0.283132-0.283335), widely overlapping the field for central Atlantic N-MORB. They form a relatively narrow, elongated trend (stretching over >6 ɛHf units) between a radiogenic MORB-like endmember and a composition located on the Nd-Hf mantle array. In contrast, all Canary Islands samples plot below the mantle array (176Hf/177Hfm = 0.282943-0.283067) and, despite being from an archipelago that stretches over a much larger geographic area, form a much denser cluster with less compositional variation (~4 ɛHf units). All samples from the seamounts NE of the Canaries, proposed to belong to the same Canary hotspot track (e.g. Geldmacher et al., 2001, JVGR 111; Geldmacher et al., 2005, EPSL 237), fall within the Hf isotopic range of this cluster. The cluster largely overlaps the composition of the proposed common mantle endmember 'C' (Hanan and Graham, 1996, Science 272) but spans a space between a more radiogenic (depleted) composition and a HIMU-type endmember. Although samples of Seine and Unicorn seamounts, attributed to the Madeira hotspot track, show less radiogenic Hf and Nd isotope ratios than Madeira, their isotopic compositions lie along an extension of the Madeira trend in plots of Hf versus Sr, Nd, Pb isotopes. The new Hf isotope ratios confirm the existence of at least two geochemically distinct volcanic provinces (Canary and Madeira) in the East Atlantic as previously proposed.
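The ɛHf values discussed above follow from the quoted 176Hf/177Hf ratios via the standard deviation-from-CHUR definition. The sketch below applies that formula to the end-member ratios in the abstract; the CHUR reference ratio used here (0.282785) is an assumed, commonly adopted value and is not given in the abstract.

# Present-day epsilon-Hf from measured 176Hf/177Hf ratios quoted in the abstract.
# The CHUR reference ratio below is an assumed constant.
CHUR_176HF_177HF = 0.282785

def epsilon_hf(ratio: float) -> float:
    """epsilon_Hf = (ratio / CHUR - 1) * 10^4."""
    return (ratio / CHUR_176HF_177HF - 1.0) * 1e4

samples = {
    "Madeira (min)": 0.283132,
    "Madeira (max)": 0.283335,
    "Canary (min)": 0.282943,
    "Canary (max)": 0.283067,
}
for name, r in samples.items():
    print(f"{name:14s} 176Hf/177Hf = {r:.6f}  ->  eps_Hf = {epsilon_hf(r):+.1f}")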
Mercury transfer from soil to olive trees. A comparison of three different contaminated sites.
Higueras, Pablo L; Amorós, José Á; Esbrí, José Maria; Pérez-de-los-Reyes, Caridad; López-Berdonces, Miguel A; García-Navarro, Francisco J
2016-04-01
Mercury contents in soil and olive tree leaves have been studied in 69 plots around three different source areas of this element in Spain: Almadén (Ciudad Real), Flix (Tarragona) and Jódar (Jaén). Almadén was the world's largest cinnabar (HgS) mining district and was active until 2003, Flix is the oldest Spanish chlor-alkali plant (CAP) and has been active from 1898 to the present day and Jódar is a decommissioned CAP that was active for 14 years (1977-1991). Total mercury contents have been measured by high-frequency modulation atomic absorption spectrometry with Zeeman effect (ZAAS-HFM) in the soils and olive tree leaves from the three studied areas. The average soil contents range from 182 μg kg(-1) in Flix to 23,488 μg kg(-1) in Almadén, while the average leaf content ranges from 161 μg kg(-1) in Jódar to 1213 μg kg(-1) in Almadén. Despite the wide range of data, a relationship between soil-leaf contents has been identified: in Almadén and Jódar, multiplicative (bilogarithmic) models show significant correlations (R = 0.769 and R = 0.484, respectively). Significant correlations were not identified between soil and leaf contents in Flix. The continuous activity of the Flix CAP, which remains open today, can explain the different uptake patterns for mercury, which is mainly atmospheric in origin, in comparison to the other two sites, where activity ceased more than 10 years ago and only soil uptake patterns based on the Michaelis-Menten enzymatic model curve are observed.
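The multiplicative (bilogarithmic) soil-to-leaf model referred to above is an ordinary least-squares fit in log-log space. The sketch below shows the calculation; the soil and leaf concentrations are synthetic and only illustrate the procedure, not the study's data.

import numpy as np

# Fitting the multiplicative (bilogarithmic) soil-to-leaf transfer model
# log10(Hg_leaf) = a + b * log10(Hg_soil). Concentrations are invented.
soil_ug_kg = np.array([250, 900, 3200, 11000, 23000], dtype=float)
leaf_ug_kg = np.array([180, 310, 520, 900, 1200], dtype=float)

x, y = np.log10(soil_ug_kg), np.log10(leaf_ug_kg)
b, a = np.polyfit(x, y, 1)        # slope and intercept of the log-log fit
r = np.corrcoef(x, y)[0, 1]       # correlation coefficient R

print(f"log10(leaf) = {a:.2f} + {b:.2f} * log10(soil),  R = {r:.3f}")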
van Rijn, Peter W; Ali, Usama S
2017-05-01
We compare three modelling frameworks for accuracy and speed of item responses in the context of adaptive testing. The first framework is based on modelling scores that result from a scoring rule that incorporates both accuracy and speed. The second framework is the hierarchical modelling approach developed by van der Linden (2007, Psychometrika, 72, 287) in which a regular item response model is specified for accuracy and a log-normal model for speed. The third framework is the diffusion framework in which the response is assumed to be the result of a Wiener process. Although the three frameworks differ in the relation between accuracy and speed, one commonality is that the marginal model for accuracy can be simplified to the two-parameter logistic model. We discuss both conditional and marginal estimation of model parameters. Models from all three frameworks were fitted to data from a mathematics and spelling test. Furthermore, we applied a linear and adaptive testing mode to the data off-line in order to determine differences between modelling frameworks. It was found that a model from the scoring rule framework outperformed a hierarchical model in terms of model-based reliability, but the results were mixed with respect to correlations with external measures. © 2017 The British Psychological Society.
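As a concrete illustration of the hierarchical framework mentioned above, the sketch below evaluates the joint likelihood of an item response and its response time under a two-parameter logistic accuracy model and a log-normal speed model; the parameterization is the standard textbook one and the numerical values are invented, so this should not be read as the study's exact specification.

import math

def p_correct(theta: float, a: float, b: float) -> float:
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def logtime_density(t: float, tau: float, alpha: float, beta: float) -> float:
    """Log-normal response-time density: ln T ~ N(beta - tau, 1/alpha^2)."""
    z = alpha * (math.log(t) - (beta - tau))
    return alpha / (t * math.sqrt(2.0 * math.pi)) * math.exp(-0.5 * z * z)

# One simulee (ability theta, speed tau) answering one item correctly in 20 s.
theta, tau = 0.5, 0.2
a, b, alpha, beta = 1.2, 0.0, 1.5, 3.0
likelihood = p_correct(theta, a, b) * logtime_density(20.0, tau, alpha, beta)
print(f"joint likelihood of (correct, 20 s): {likelihood:.4f}")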
A complete categorization of multiscale models of infectious disease systems.
Garira, Winston
2017-12-01
Modelling of infectious disease systems has entered a new era in which disease modellers are increasingly turning to multiscale modelling to extend traditional modelling frameworks into new application areas and to achieve higher levels of detail and accuracy in characterizing infectious disease systems. In this paper we present a categorization framework for categorizing multiscale models of infectious disease systems. The categorization framework consists of five integration frameworks and five criteria. We use the categorization framework to give a complete categorization of host-level immuno-epidemiological models (HL-IEMs). This categorization framework is also shown to be applicable in categorizing other types of multiscale models of infectious diseases beyond HL-IEMs through modifying the initial categorization framework presented in this study. Categorization of multiscale models of infectious disease systems in this way is useful in bringing some order to the discussion on the structure of these multiscale models.
A UML profile for framework modeling.
Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong
2004-01-01
The current standard Unified Modeling Language (UML) cannot adequately model framework flexibility and extensibility due to the lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams could be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.
Use of Annotations for Component and Framework Interoperability
NASA Astrophysics Data System (ADS)
David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.
2009-12-01
The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0 framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively such as implicit multithreading, and auto-documenting capabilities while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to framework by the use of specific APIs and/or data types they can more easily be reused both within the framework as well as outside of it. To study the effectiveness of an annotation based framework approach with other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
A number of multimedia modeling frameworks are currently being developed. The Multimedia Integrated Modeling System (MIMS) is one of these frameworks. A framework should be seen as more of a multimedia modeling infrastructure than a single software system. This infrastructure do...
Narrative review of frameworks for translating research evidence into policy and practice.
Milat, Andrew J; Li, Ben
2017-02-15
A significant challenge in research translation is that interested parties interpret and apply the associated terms and conceptual frameworks in different ways. The purpose of this review was to: a) examine different research translation frameworks; b) examine the similarities and differences between the frameworks; and c) identify key strengths and weaknesses of the models when they are applied in practice. The review involved a keyword search of PubMed. The search string was (translational research OR knowledge translation OR evidence to practice) AND (framework OR model OR theory) AND (public health OR health promotion OR medicine). Included studies were published in English between January 1990 and December 2014, and described frameworks, models or theories associated with research translation. The final review included 98 papers, and 41 different frameworks and models were identified. The most frequently applied knowledge translation framework in the literature was RE-AIM, followed by the knowledge translation continuum or 'T' models, the Knowledge to Action framework, the PARiHS framework, evidence based public health models, and the stages of research and evaluation model. The models identified in this review stem from different fields, including implementation science, basic and medical sciences, health services research and public health, and propose different but related pathways to closing the research-practice gap.
The code base for creating versions of the USEEIO model and USEEIO-like models is called the USEEIO Modeling Framework. The framework is built in a combination of R and Python languages. This demonstration provides a brief overview and introduction to the framework.
Model and Interoperability using Meta Data Annotations
NASA Astrophysics Data System (ADS)
David, O.
2011-12-01
Software frameworks and architectures are in need of meta data to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format, consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing meta data, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach for meta data representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. While providing all those capabilities, a significant reduction in the size of the model source code was achieved. To assess the benefit of annotations for a modeler, studies were conducted to evaluate the effectiveness of an annotation-based framework approach against other modeling frameworks and libraries; in particular, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A typical hydrological model was implemented across several modeling frameworks, and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks.
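The following sketch illustrates the general idea of annotation-declared component metadata. OMS itself uses Java annotations; the Python decorators, names and toy component below are hypothetical and serve only to show how declarative metadata can replace framework API calls.

# Conceptual analogue of annotation-based component metadata, sketched with
# Python decorators. The decorator name "declare", the metadata layout and the
# toy component are hypothetical, not the OMS API (which is Java-based).
def declare(kind, name, unit=None):
    """Class decorator recording one piece of I/O metadata on the component."""
    def wrap(cls):
        if "_meta" not in cls.__dict__:          # give each class its own table
            cls._meta = {"inputs": [], "outputs": []}
        cls._meta[kind].append({"name": name, "unit": unit})
        return cls
    return wrap

@declare("inputs", "precip", "mm/day")
@declare("inputs", "temperature", "degC")
@declare("outputs", "runoff", "mm/day")
class MonthlyWaterBalance:
    """A component whose interface is described declaratively, not via API calls."""
    def execute(self, precip, temperature):
        # trivially simple placeholder physics
        return {"runoff": max(0.0, 0.5 * precip - 0.1 * temperature)}

# A framework can now discover the component's interface without invasive code:
print(MonthlyWaterBalance._meta)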
A Framework for Developing the Structure of Public Health Economic Models.
Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P
2016-01-01
A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of Public Health economic models, supporting efficient allocation of scarce resources. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Liu, Y F; Yu, H; Wang, W N; Gao, B
2017-06-09
Objective: To evaluate the processing accuracy, internal quality and suitability of titanium alloy removable partial denture (RPD) frameworks fabricated by the selective laser melting (SLM) technique, and to provide a reference for clinical application. Methods: The plaster model of one clinical patient was used as the working model; it was scanned and reconstructed into a digital working model, and an RPD framework was designed on it. Then, eight corresponding RPD frameworks were fabricated using the SLM technique. A three-dimensional (3D) optical scanner was used to scan the frameworks and obtain their 3D data, which were compared with the original computer-aided design (CAD) model to evaluate processing precision. Traditional cast pure titanium frameworks were used as the control group, and internal quality was analyzed by X-ray examination. Finally, the fit of the frameworks was examined on the plaster model. Results: The overall average deviation of the titanium alloy RPD frameworks fabricated by SLM technology was (0.089±0.076) mm, and the root mean square error was 0.103 mm. No visible pores, cracks or other internal defects were detected in the frameworks. Each framework seated completely on the plaster model, its tissue surface fitted the model well, and there was no obvious movement. Conclusions: The titanium alloy RPD frameworks fabricated by SLM technology are of good quality.
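The accuracy figures quoted above (mean ± SD deviation and root mean square error) are standard summaries of point-wise surface deviations between the scanned framework and the CAD reference. The sketch below shows how they are typically computed; the deviation values are synthetic, not data from the study.

import numpy as np

# Summarising per-point deviations between a scanned surface and its CAD
# reference: mean +/- SD deviation and RMSE. Values are synthetic; a real
# comparison would use the point-wise distances exported by inspection software.
rng = np.random.default_rng(0)
deviations_mm = rng.normal(loc=0.09, scale=0.08, size=5000)   # signed distances

mean_dev = deviations_mm.mean()
sd_dev = deviations_mm.std(ddof=1)
rmse = np.sqrt(np.mean(deviations_mm ** 2))

print(f"mean deviation = {mean_dev:.3f} +/- {sd_dev:.3f} mm, RMSE = {rmse:.3f} mm")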
Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather
2017-11-28
There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined; included papers focus on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks or guidelines, are described in a public health or healthcare setting or in non-government, government, or community organizations as they relate to healthcare, and explicitly or implicitly mention a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessment was performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, categorizing theoretical foundations according to which theory, model and/or framework was used and whether or not the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. It provides public health practitioners with a menu of potentially usable theories, models and frameworks to support capacity building efforts. The findings also support the need for the use of theories, models or frameworks to be intentional and explicitly identified and referenced, and for it to be clearly outlined how they were applied to the capacity building intervention.
Sabale, Siddharth S.; Kadam, Dattatray P.; Sarkate, Laxman B.; Bellare, Jayesh R.
2011-01-01
Polysulfone (Psf) hollow fiber membranes (HFMs) have been widely used in blood purification, but their biocompatibility remains a concern. To enhance their biocompatibility, Psf/TPGS (d-α-tocopheryl polyethylene glycol 1000 succinate) composite HFMs and 2-methacryloyloxyethyl phosphorylcholine (MPC) coated Psf HFMs have been prepared. They have been evaluated for in vivo biocompatibility and graft acceptance and compared with sham and commercial membranes by intra-peritoneal implantation in rats at days 7 and 21. Normal body weights, tissue formation and angiogenesis indicate acceptance of the implants by the animals. Hematological observations show the presence of post-surgical stress, which subsides over time. Serum biochemistry results reveal normal organ function and elevated liver ALP levels at day 21. Histological studies show fibroblast recruitment, angiogenesis and collagen deposition at the implant surface, indicating new tissue formation. Immunohistochemistry studies show non-activation of MHC molecules, signifying biocompatibility. Additionally, Psf/TPGS HFMs exhibit the most favorable tissue response compared with the other HFMs, making them the material of choice for HFM preparation for hemodialysis applications. PMID:22046236
[The effect of mandibular distraction on the maxilla growth in children with hemifacial microsomia].
Yang, Renkai; Tang, Xiaojun; Shi, Lei; Yin, Lin; Yang, Bin; Yin, Hongyu; Liu, Wei; Zhang, Zhiyong
2014-11-01
To analyze the effect of mandibular distraction on maxillary growth in children with hemifacial microsomia through measurements on posterior-anterior cephalometric X-ray films and three-dimensional CT reconstruction images. The angular deviations of the maxillary occlusal plane and the nasal base plane from the infra-orbital plane were measured on posterior-anterior cephalometric X-ray films in 22 patients before and half a year after the operation. The vertical distances from the midpoint of the alveolar ridge at the fifth tooth and from the lowest point of the maxillary sinus to the reference plane were measured on 3D reconstruction images in 15 patients. The data were statistically analyzed. On the posterior-anterior cephalometric X-ray films, the cant of the occlusal plane was significantly reduced (P < 0.05), while the angle between the nasal base plane and the infra-orbital plane showed no significant change. On the 3D reconstruction images, all measurement points had declined significantly except the lowest point of the maxillary sinus on the normal side. Distraction osteogenesis of the mandible can promote growth of the maxilla in children with HFM; the accelerated growth involves the alveolar bone and the maxillary sinus.
NASA Astrophysics Data System (ADS)
Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.
2014-12-01
In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a modeling framework's native component interface. (3) Create semantic mappings between modeling frameworks that support semantic mediation. This third goal involves creating a crosswalk between the CF Standard Names and the CSDMS Standard Names (a set of naming conventions). This talk will summarize progress towards these goals.
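Goal (2) above relies on the CSDMS Basic Model Interface (BMI). The sketch below wraps a toy model behind a BMI-style interface showing only a small subset of the specification's functions; the model itself, its configuration dictionary and its variable names are hypothetical.

# Minimal sketch of a model exposed through a BMI-style interface. Only a small
# subset of the CSDMS BMI is shown, and the toy linear-reservoir "model" and its
# variable names are illustrative assumptions.
class LinearReservoirBMI:
    """Toy storage model dS/dt = P - k*S exposed through BMI-like methods."""

    def initialize(self, config: dict) -> None:
        self._k = config.get("k", 0.1)           # outflow coefficient [1/day]
        self._storage = config.get("s0", 10.0)   # initial storage [mm]
        self._precip = 0.0
        self._time = 0.0
        self._dt = 1.0                           # time step [days]

    def update(self) -> None:
        self._storage += self._dt * (self._precip - self._k * self._storage)
        self._time += self._dt

    def finalize(self) -> None:
        pass

    def get_current_time(self) -> float:
        return self._time

    def get_value(self, name: str) -> float:
        return {"storage": self._storage, "precipitation": self._precip}[name]

    def set_value(self, name: str, value: float) -> None:
        if name == "precipitation":
            self._precip = value

# A framework adapter only needs these calls, never the model internals:
model = LinearReservoirBMI()
model.initialize({"k": 0.2, "s0": 5.0})
for precip in (4.0, 0.0, 2.0):
    model.set_value("precipitation", precip)
    model.update()
print(model.get_current_time(), round(model.get_value("storage"), 2))
model.finalize()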
A Model Independent S/W Framework for Search-Based Software Testing
Baik, Jongmoon
2014-01-01
In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models are used in MBT. If the type of model changes from one to another, all functions of a search technique must be reimplemented, even if the same search technique is applied, because the model types differ. This requires too much time and effort. We propose a model-independent software framework for SBST, which can reduce such redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when the type of a model is changed. PMID:25302314
Business model framework applications in health care: A systematic review.
Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl
2017-11-01
It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.
A Modular Simulation Framework for Assessing Swarm Search Models
2014-09-01
Wanier, Blake M.
Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models ... as benchmarks for future exploration of more sophisticated swarm search scenarios. Subject terms: swarm search, search theory, modeling framework.
EPA'S NEW EMISSIONS MODELING FRAMEWORK
EPA's Office of Air Quality Planning and Standards is building a new Emissions Modeling Framework that will solve many of the long-standing difficulties of emissions modeling. The goals of the Framework are to (1) prevent bottlenecks and errors caused by emissions modeling activi...
Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model
ERIC Educational Resources Information Center
Berman, Jeanette; Smyth, Robyn
2015-01-01
This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…
Rethinking modeling framework design: object modeling system 3.0
USDA-ARS's Scientific Manuscript database
The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...
Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework
ERIC Educational Resources Information Center
Chen, Huilin; Chen, Jinsong
2016-01-01
Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…
Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results
ERIC Educational Resources Information Center
Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.
2015-01-01
We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…
The Foundations Framework for Developing and Reporting New Models of Care for Multimorbidity
Stokes, Jonathan; Man, Mei-See; Guthrie, Bruce; Mercer, Stewart W.; Salisbury, Chris; Bower, Peter
2017-01-01
PURPOSE Multimorbidity challenges health systems globally. New models of care are urgently needed to better manage patients with multimorbidity; however, there is no agreed framework for designing and reporting models of care for multimorbidity and their evaluation. METHODS Based on findings from a literature search to identify models of care for multimorbidity, we developed a framework to describe these models. We illustrate the application of the framework by identifying the focus and gaps in current models of care, and by describing the evolution of models over time. RESULTS Our framework describes each model in terms of its theoretical basis and target population (the foundations of the model) and of the elements of care implemented to deliver the model. We categorized elements of care into 3 types: (1) clinical focus, (2) organization of care, (3) support for model delivery. Application of the framework identified a limited use of theory in model design and a strong focus on some patient groups (elderly, high users) more than others (younger patients, deprived populations). We found changes in elements with time, with a decrease in models implementing home care and an increase in models offering extended appointments. CONCLUSIONS By encouraging greater clarity about the underpinning theory and target population, and by categorizing the wide range of potentially important elements of an intervention to improve care for patients with multimorbidity, the framework may be useful in designing and reporting models of care and help advance the currently limited evidence base. PMID:29133498
BioASF: a framework for automatically generating executable pathway models specified in BioPAX.
Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap
2016-06-15
Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF CONTACT: j.heringa@vu.nl. © The Author 2016. Published by Oxford University Press.
Comparison and Contrast of Two General Functional Regression Modeling Frameworks
Morris, Jeffrey S.
2017-01-01
In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past number of years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable. PMID:28736502
Enterprise application architecture development based on DoDAF and TOGAF
NASA Astrophysics Data System (ADS)
Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng
2017-05-01
For the purpose of supporting the design and analysis of enterprise application architecture, here, we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on architecture content framework (ACF), DoDAF metamodel (DM2) and Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defence Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationship among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.
Wu, Zujian; Pang, Wei; Coghill, George M
2015-01-01
Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that the proposed integrative framework can learn the relationships between biochemical reactants qualitatively and can make the model replicate the behaviour of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can then perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
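The quantitative stage described above tunes kinetic rates with simulated annealing. The sketch below shows a generic annealing loop fitting a single rate constant to a target time course; the one-parameter decay model, the data and the cooling schedule are illustrative assumptions rather than details from the paper.

import math
import random

# Generic simulated-annealing sketch: tune a kinetic rate k so that the model
# exp(-k*t) reproduces a target concentration series. All values are illustrative.
random.seed(1)
times = [0.0, 1.0, 2.0, 4.0, 8.0]
target = [1.0, 0.61, 0.37, 0.14, 0.02]           # desired species concentrations

def cost(k: float) -> float:
    """Sum-of-squares mismatch between the model and the target series."""
    return sum((math.exp(-k * t) - y) ** 2 for t, y in zip(times, target))

k = 2.0
temperature = 1.0
while temperature > 1e-4:
    candidate = abs(k + random.gauss(0.0, 0.1))  # perturb the rate, keep it positive
    delta = cost(candidate) - cost(k)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        k = candidate                            # accept downhill, sometimes uphill
    temperature *= 0.995                         # geometric cooling
print(f"fitted rate k = {k:.3f}, cost = {cost(k):.4f}")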
Koopman Operator Framework for Time Series Modeling and Analysis
NASA Astrophysics Data System (ADS)
Surana, Amit
2018-01-01
We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
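Dynamic mode decomposition (DMD) is the standard data-driven route to the Koopman spectral properties referred to above; whether the paper uses exactly this algorithm is not stated, so the sketch below should be read as a generic illustration on a toy oscillatory time series.

import numpy as np

# Data-driven approximation of Koopman spectral properties via dynamic mode
# decomposition (DMD): a best-fit linear operator identified from snapshot pairs.
rng = np.random.default_rng(0)
t = np.arange(0, 200) * 0.1
series = np.vstack([np.sin(t), np.cos(t)]) + 0.01 * rng.standard_normal((2, t.size))

X, Y = series[:, :-1], series[:, 1:]       # snapshot pairs x_k -> x_{k+1}
A = Y @ np.linalg.pinv(X)                  # best-fit linear (Koopman-like) operator
eigvals, modes = np.linalg.eig(A)

# Continuous-time frequencies recovered from the discrete eigenvalues.
freqs = np.log(eigvals).imag / 0.1
print("eigenvalue magnitudes:", np.round(np.abs(eigvals), 3))
print("recovered frequencies:", np.round(freqs, 3), "(true value: +/-1.0 rad/s)")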
A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service
Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin
2014-01-01
Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
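To make the "black box" publishing idea concrete, the sketch below wraps a toy raster operation (an NDVI calculation) as an HTTP service. The Flask route, payload format and model are illustrative assumptions, not the framework's actual interface.

```python
# Minimal sketch: wrapping a "black box" RS model as an HTTP service.
# The /models/ndvi route and JSON payload are illustrative only.
from flask import Flask, jsonify, request

app = Flask(__name__)

def ndvi(red, nir):
    """Toy remote-sensing model: normalized difference vegetation index."""
    return [(n - r) / (n + r) if (n + r) != 0 else 0.0
            for r, n in zip(red, nir)]

@app.route("/models/ndvi", methods=["POST"])
def run_model():
    payload = request.get_json()          # e.g. {"red": [...], "nir": [...]}
    return jsonify(ndvi=ndvi(payload["red"], payload["nir"]))

if __name__ == "__main__":
    app.run(port=8080)
```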
A modelling framework to simulate foliar fungal epidemics using functional–structural plant models
Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe
2014-01-01
Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both previously developed models for individual aspects of pathosystems and new ones. Complex models are deconstructed into separate ‘knowledge sources’ originating from different specialist areas of expertise and these can be shared and reassembled into multidisciplinary models. The framework thus provides a beneficial tool for a potential diverse and dynamic research community. PMID:24925323
Documentation for the MODFLOW 6 framework
Hughes, Joseph D.; Langevin, Christian D.; Banta, Edward R.
2017-08-10
MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. Growing interest in surface and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion, has led to the development of numerous MODFLOW versions. Often times, there are incompatibilities between these different MODFLOW versions. The report describes a new MODFLOW framework called MODFLOW 6 that is designed to support multiple models and multiple types of models. The framework is written in Fortran using a modular object-oriented design. The primary framework components include the simulation (or main program), Timing Module, Solutions, Models, Exchanges, and Utilities. The first version of the framework focuses on numerical solutions, numerical models, and numerical exchanges. This focus on numerical models allows multiple numerical models to be tightly coupled at the matrix level.
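The framework itself is written in Fortran; the Python sketch below merely mirrors the reported component hierarchy (simulation, timing, solutions, models, exchanges) to make the idea of tight matrix-level coupling concrete. All class names and behaviors are illustrative assumptions, not the MODFLOW 6 source.

```python
class Model:
    """A numerical model contributing its equations to a shared system."""
    def __init__(self, name):
        self.name = name
    def add_equations(self, system):
        system.setdefault(self.name, 0)    # placeholder contribution

class Exchange:
    """Couples two models by adding off-diagonal terms to the same system."""
    def __init__(self, model_a, model_b):
        self.pair = (model_a, model_b)
    def add_coupling(self, system):
        system[self.pair] = 0              # placeholder coupling terms

class Solution:
    """Assembles all attached models and exchanges into one matrix solve."""
    def __init__(self):
        self.models, self.exchanges = [], []
    def solve(self):
        system = {}
        for m in self.models:
            m.add_equations(system)
        for e in self.exchanges:
            e.add_coupling(system)
        return system                      # one tightly coupled system

class Simulation:
    """Main program: advances all solutions over the timing module's steps."""
    def __init__(self, timing, solutions):
        self.timing, self.solutions = timing, solutions
    def run(self):
        for _ in self.timing:
            for s in self.solutions:
                s.solve()

# Two groundwater-flow models coupled at the matrix level through one solution.
gwf_a, gwf_b = Model("gwf-a"), Model("gwf-b")
sol = Solution()
sol.models += [gwf_a, gwf_b]
sol.exchanges.append(Exchange("gwf-a", "gwf-b"))
Simulation(timing=range(10), solutions=[sol]).run()
```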
A conceptual modeling framework for discrete event simulation using hierarchical control structures.
Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D
2015-08-01
Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a models' system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
A framework for modelling the complexities of food and water security under globalisation
NASA Astrophysics Data System (ADS)
Dermody, Brian J.; Sivapalan, Murugesu; Stehfest, Elke; van Vuuren, Detlef P.; Wassen, Martin J.; Bierkens, Marc F. P.; Dekker, Stefan C.
2018-01-01
We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.
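A minimal sketch of the multi-agent core, assuming made-up city agents, trade links and a simple surplus-to-deficit rule; it only illustrates how emergent flows on an infrastructural network could be computed and then aggregated for feedback, and does not reproduce the authors' model.

```python
# Illustrative only: city agents on a trade network redistribute food.
import networkx as nx

class City:
    def __init__(self, name, production, demand):
        self.name, self.production, self.demand = name, production, demand
    @property
    def surplus(self):
        return self.production - self.demand

def trade_step(graph):
    """Move food along trade links from surplus cities to deficit neighbours."""
    flows = {}
    for a, b in graph.edges:
        ca, cb = graph.nodes[a]["city"], graph.nodes[b]["city"]
        if ca.surplus > 0 and cb.surplus < 0:
            moved = min(ca.surplus, -cb.surplus)
            ca.production -= moved
            cb.production += moved
            flows[(a, b)] = moved
    return flows

g = nx.Graph()
for name, prod, dem in [("A", 120, 80), ("B", 40, 90), ("C", 70, 60)]:
    g.add_node(name, city=City(name, prod, dem))
g.add_edges_from([("A", "B"), ("B", "C")])
print(trade_step(g))   # emergent flows to be aggregated and fed back upstream
```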
DOT National Transportation Integrated Search
1993-12-01
This report presents a comprehensive modeling framework for user responses to Advanced Traveler Information Systems (ATIS) services and identifies the data needs for the validation of such a framework. The authors present overviews of the framework b...
Models of Recognition, Repetition Priming, and Fluency : Exploring a New Framework
ERIC Educational Resources Information Center
Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.
2012-01-01
We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Developing a theoretical framework for complex community-based interventions.
Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana
2014-01-01
Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.
Borycki, E M; Kushniruk, A W; Bellwood, P; Brender, J
2012-01-01
The objective of this paper is to examine the extent, range and scope to which frameworks, models and theories dealing with technology-induced error have arisen in the biomedical and life sciences literature as indexed by Medline®. To better understand the state of work in the area of technology-induced error involving frameworks, models and theories, the authors conducted a search of Medline® using selected key words identified from seminal articles in this research area. Articles were reviewed and those pertaining to frameworks, models or theories dealing with technology-induced error were further reviewed by two researchers. All articles from Medline® from its inception to April of 2011 were searched using the strategy outlined above. A total of 239 citations were returned. Each of the abstracts for the 239 citations was reviewed by two researchers. Eleven articles met the criteria based on abstract review. These 11 articles were downloaded for further in-depth review. The majority of the articles obtained describe frameworks and models with reference to theories developed in other literatures outside of healthcare. The papers were grouped into several areas. It was found that articles drew mainly from three literatures: 1) the human factors literature (including human-computer interaction and cognition), 2) the organizational behavior/sociotechnical literature, and 3) the software engineering literature. A variety of frameworks and models were found in the biomedical and life sciences literatures. These frameworks and models drew upon and extended frameworks, models and theoretical perspectives that have emerged in other literatures. These frameworks and models are informing an emerging line of research in health and biomedical informatics involving technology-induced errors in healthcare.
Template-Based Geometric Simulation of Flexible Frameworks
Wells, Stephen A.; Sartbaeva, Asel
2012-01-01
Specialised modelling and simulation methods implementing simplified physical models are valuable generators of insight. Template-based geometric simulation is a specialised method for modelling flexible framework structures made up of rigid units. We review the background, development and implementation of the method, and its applications to the study of framework materials such as zeolites and perovskites. The “flexibility window” property of zeolite frameworks is a particularly significant discovery made using geometric simulation. Software implementing geometric simulation of framework materials, “GASP”, is freely available to researchers. PMID:28817055
A flexible framework has been created for modeling multi-dimensional hydrological and water quality processes within stormwater green infrastructures (GIs). The framework models a GI system using a set of blocks (spatial features) and connectors (interfaces) representing differen...
A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...
Space-Time Processing for Tactical Mobile Ad Hoc Networks
2008-08-01
...vision for multiple concurrent communication settings, i.e., a many-to-many framework where multi-packet transmissions (MPTs) and multi-packet ... We have introduced the first unified modelling framework for the computation of fundamental limits of capacity-delay tradeoffs ... modalities in wireless networks ... multi-packet modelling framework to account for the use of multi-packet reception (MPR) in ad hoc networks with MPT under ...
USDA-ARS's Scientific Manuscript database
AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic and water quality (H/WQ) simulation components under the Java Connection Framework (JCF) and the Object Modeling System (OMS) environmental modeling framework. AgES-W is implicitly scala...
NASA Technical Reports Server (NTRS)
Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)
2002-01-01
One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.
Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina
2016-01-01
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405
A generic framework for individual-based modelling and physical-biological interaction
2018-01-01
The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with realistic 3D oceanographic model of physics and biogeochemistry describing the environment of the organisms without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and recently posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progresses in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
Modeling Synergistic Drug Inhibition of Mycobacterium tuberculosis Growth in Murine Macrophages
2011-01-01
...important application of metabolic network modeling is the ability to quantitatively model metabolic enzyme inhibition and predict bacterial growth ... describe the extensions of this framework to model drug-induced growth inhibition of M. tuberculosis in macrophages. ... As a starting point, we used the previously developed iNJ661v model to represent the metabolic ... [Fig. 1 caption: "Mathematical framework: a set of coupled models used to ..."]
We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...
NASA Technical Reports Server (NTRS)
Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn
2002-01-01
One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.
Advanced Computational Framework for Environmental Management ZEM, Version 1.x
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin
2016-11-04
Typically, environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models with different complexity for various types of model analyses where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and model are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed an integrated framework for real-time data and model analyses for environmental decision-making called ZEM. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically-defensible decision-making applying advanced decision analyses tools such as Bayesian-Information-Gap Decision Theory (BIG-DT). The framework also includes advanced methods for optimization that are capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source and can be applied to any environmental management site. The framework will be open-source and released under the GPL V3 license.
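ZEM itself is written in Julia and its interfaces are not reproduced here; the scikit-learn sketch below only illustrates the cited surrogate (reduced-order) modeling idea, fitting a support vector regression to a handful of runs of a stand-in "expensive" model. All names and numbers are assumptions.

```python
import numpy as np
from sklearn.svm import SVR

# Stand-in for an expensive process model (e.g. a flow-and-transport run).
def expensive_model(k):
    return np.exp(-k) + 0.05 * np.sin(10 * k)

# Train a support-vector-regression surrogate on a few expensive evaluations.
k_train = np.linspace(0.1, 2.0, 15).reshape(-1, 1)
y_train = expensive_model(k_train).ravel()
surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(k_train, y_train)

# The cheap surrogate can now stand in for the model inside decision analyses.
k_new = np.array([[0.7], [1.3]])
print(surrogate.predict(k_new), expensive_model(k_new).ravel())
```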
Model-Based Reasoning in Upper-division Lab Courses
NASA Astrophysics Data System (ADS)
Lewandowski, Heather
2015-05-01
Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of the measurement tools, modeling "black boxes," and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.
A Simulation and Modeling Framework for Space Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S
This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
Frameworks for Assessing the Quality of Modeling and Simulation Capabilities
NASA Astrophysics Data System (ADS)
Rider, W. J.
2012-12-01
The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provides an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States' Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice, and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law that made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention on the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are incomplete and need to be extended, incorporating elements from the others as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature in that no one likes to be graded, or told they are not sufficiently quality oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as a collaborative tool. Instead, these frameworks should be used to structure collaborations that can be used to assist the modeling and simulation efforts to be high quality. The framework provides a comprehensive setting of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nuc. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, STANDARD FOR MODELS AND SIMULATIONS, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A (2012) 468, 227-244.
PACS/information systems interoperability using Enterprise Communication Framework.
alSafadi, Y; Lord, W P; Mankovich, N J
1998-06-01
Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
Jang, Nulee; Yasin, Muhammad; Kang, Hyunsoo; Lee, Yeubin; Park, Gwon Woo; Park, Shinyoung; Chang, In Seop
2018-05-04
This study investigated the effects of electrolytes (CaCl2, K2HPO4, MgSO4, NaCl, and NH4Cl) on CO mass transfer and ethanol production in a hollow fiber membrane bioreactor (HFMBR). The hollow fiber membranes (HFM) were found to generate tiny gas bubbles; bubble coalescence was significantly suppressed in electrolyte solution. The volumetric gas-liquid mass transfer coefficient (kLa) increased by up to 414% compared to the control. The CO saturation concentration (C*) decreased as electrolyte concentrations increased. Overall, the maximum mass transfer rate (Rmax) in electrolyte solution ranged from 106% to 339% of the value obtained in water. Electrolyte toxicity on cell growth was tested using Clostridium autoethanogenum. Most electrolytes, except for MgSO4, inhibited cell growth. HFMBR operation using a medium containing 1% MgSO4 achieved 119% ethanol production compared to that without electrolytes. Finally, a kinetic simulation using the parameters obtained from the 1% MgSO4 medium predicted higher ethanol production compared to the control.
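For readers unfamiliar with the notation, the relation below is the standard two-film description of gas-liquid mass transfer written with the abstract's symbols; it is background, not an equation quoted from the paper.

```latex
% Standard two-film description of gas-liquid CO transfer (abstract's notation):
\begin{equation}
  \frac{dC_{\mathrm{CO}}}{dt} = k_L a \left( C^{*} - C_{\mathrm{CO}} \right),
  \qquad
  R_{\max} = k_L a \, C^{*},
\end{equation}
% so electrolytes can raise the transfer rate through k_L a (smaller,
% non-coalescing bubbles) even while lowering the saturation concentration C*.
```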
High productivity mould robotic milling in Al-5083
NASA Astrophysics Data System (ADS)
Urresti, Iker; Arrazola, Pedro Jose; Ørskov, Klaus Bonde; Pelegay, Jose Angel
2018-05-01
Until very recently, industrial serial robots were mostly limited to welding, handling or spray painting operations. However, some industries have already recognized their important capabilities in terms of flexibility, working space, adaptability and cost, and robots are now seriously being considered for certain metal machining tasks. Robot-based machining is therefore presented as a cost-saving and flexible manufacturing alternative to conventional CNC machines, especially for roughing or even pre-roughing of large parts. Nevertheless, there are still some drawbacks, usually described as low rigidity, accuracy and repeatability. As a result, process productivity is usually sacrificed, yielding low Material Removal Rates (MRR) and an uncompetitive process. In this paper, different techniques to increase productivity are presented, through an appropriate selection of the cutting strategies and parameters that are essential for it. Rough milling tests in Al-5083 are presented in which High Feed Milling (HFM) is implemented as a productive cutting strategy and the experimental modal analysis known as tap-testing is used to choose suitable cutting conditions. Competitive productivity rates are achieved, while process stability is verified through cutting force measurements in order to prove the effectiveness of the experimental modal analysis for robotic machining.
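As background on why high-feed strategies raise productivity, the sketch below computes the conventional material removal rate from milling parameters; the numerical values are placeholders, not the cutting conditions used in the study.

```python
def material_removal_rate(ap_mm, ae_mm, fz_mm, z_teeth, n_rpm):
    """Conventional milling MRR in cm^3/min from axial depth of cut (ap),
    radial width of cut (ae), feed per tooth (fz), tooth count (z) and
    spindle speed (n)."""
    vf = fz_mm * z_teeth * n_rpm          # table feed, mm/min
    return ap_mm * ae_mm * vf / 1000.0    # mm^3/min -> cm^3/min

# Placeholder high-feed-milling values (shallow ap, large fz), not the paper's data.
print(material_removal_rate(ap_mm=1.0, ae_mm=20.0, fz_mm=1.2, z_teeth=4, n_rpm=6000))
```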
ERIC Educational Resources Information Center
Grimm, Kevin; Zhang, Zhiyong; Hamagami, Fumiaki; Mazzocco, Michele
2013-01-01
We propose the use of the latent change and latent acceleration frameworks for modeling nonlinear growth in structural equation models. Moving to these frameworks allows for the direct identification of "rates of change" and "acceleration" in latent growth curves--information available indirectly through traditional growth…
The Smoothed Dirichlet Distribution: Understanding Cross-Entropy Ranking in Information Retrieval
2006-07-01
Unigram language modeling is a successful probabilistic framework for Information Retrieval (IR) that uses ... the Relevance model (RM), a state-of-the-art model for IR in the language modeling framework that uses the same cross-entropy as its ranking function ... In addition, the SD-based classifier provides more flexibility than RM in modeling documents owing to a consistent generative framework.
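A minimal sketch of the cross-entropy ranking function used in the language-modeling approach to IR that the abstract refers to; the additive smoothing constant and toy documents are illustrative assumptions, and the Smoothed Dirichlet model itself is not implemented here.

```python
import math
from collections import Counter

def cross_entropy_score(query_model, doc_terms, vocab, mu=0.01):
    """Rank score = -H(query || document) = sum_w P(w|q) log P(w|d),
    with simple additive smoothing of the document model (illustrative)."""
    counts = Counter(doc_terms)
    total = len(doc_terms)
    score = 0.0
    for w, p_q in query_model.items():
        p_d = (counts[w] + mu) / (total + mu * len(vocab))
        score += p_q * math.log(p_d)
    return score

vocab = {"koopman", "retrieval", "language", "model", "ranking"}
query_model = {"language": 0.5, "model": 0.5}          # P(w|q)
doc_a = "language model ranking retrieval".split()
doc_b = "koopman model".split()
print(cross_entropy_score(query_model, doc_a, vocab) >
      cross_entropy_score(query_model, doc_b, vocab))   # True: doc_a ranks higher
```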
Modelling Participatory Geographic Information System for Customary Land Conflict Resolution
NASA Astrophysics Data System (ADS)
Gyamera, E. A.; Arko-Adjei, A.; Duncan, E. E.; Kuma, J. S. Y.
2017-11-01
Since land contributes to about 73% of most countries' Gross Domestic Product (GDP), attention to land rights has increased tremendously worldwide. Conflicts over land have therefore become one of the major problems associated with land administration. However, conventional mechanisms for land conflict resolution do not provide satisfactory results to disputants, due to various factors. This study sought to develop a framework for using a Participatory Geographic Information System (PGIS) for customary land conflict resolution. The framework was modelled using the Unified Modelling Language (UML). The PGIS framework, called the butterfly model, consists of three units, namely the Social Unit (SU), Technical Unit (TU) and Decision Making Unit (DMU). The name butterfly model was adopted for the framework based on its features and properties. The framework is therefore recommended for land conflict resolution in customary areas.
NASA Technical Reports Server (NTRS)
1977-01-01
The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model is documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.
An Illustrative Guide to the Minerva Framework
NASA Astrophysics Data System (ADS)
Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration
2017-10-01
Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
NASA Technical Reports Server (NTRS)
Kim, E.; Tedesco, M.; Reichle, R.; Choudhury, B.; Peters-Lidard, C.; Foster, J.; Hall, D.; Riggs, G.
2006-01-01
Microwave-based retrievals of snow parameters from satellite observations have a long heritage and have so far been generated primarily by regression-based empirical "inversion" methods based on snapshots in time. Direct assimilation of microwave radiance into physical land surface models can be used to avoid errors associated with such retrieval/inversion methods, instead utilizing more straightforward forward models and temporal information. This approach has been used for years for atmospheric parameters by the operational weather forecasting community with great success. Recent developments in forward radiative transfer modeling, physical land surface modeling, and land data assimilation are converging to allow the assembly of an integrated framework for snow/cold lands modeling and radiance assimilation. The objective of the Goddard snow radiance assimilation project is to develop such a framework and explore its capabilities. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. In fact, multiple models are available for each element enabling optimization to match the needs of a particular study. Together these form a modular and flexible framework for self-consistent, physically-based remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster. Capabilities for assimilation of snow retrieval products are already under development for LIS. We will describe plans to add radiance-based assimilation capabilities. Plans for validation activities using field measurements will also be discussed.
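To illustrate the radiance-assimilation idea in the simplest possible setting, the sketch below performs a single scalar Kalman-style update of snow water equivalent from a brightness-temperature observation; the linear forward model and all numbers are made up, and the actual framework uses full radiative transfer and land surface models within LIS.

```python
def assimilate_radiance(swe_prior, var_prior, tb_obs, var_obs, forward):
    """One scalar Kalman-style update of snow water equivalent (SWE)
    from an observed brightness temperature, given a forward radiative model."""
    # Linearize the forward model around the prior by finite differences.
    eps = 1e-3
    h = (forward(swe_prior + eps) - forward(swe_prior)) / eps
    innovation = tb_obs - forward(swe_prior)
    gain = var_prior * h / (h * var_prior * h + var_obs)
    swe_post = swe_prior + gain * innovation
    var_post = (1.0 - gain * h) * var_prior
    return swe_post, var_post

# Made-up linearized forward model: brightness temperature drops with SWE.
forward = lambda swe: 260.0 - 0.4 * swe        # K, illustrative only
print(assimilate_radiance(swe_prior=100.0, var_prior=400.0,
                          tb_obs=215.0, var_obs=4.0, forward=forward))
```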
Framework for assessing key variable dependencies in loose-abrasive grinding and polishing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J.S.; Aikens, D.M.; Brown, N.J.
1995-12-01
This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.
Model-based reasoning in the physics laboratory: Framework and initial results
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.
2015-12-01
[This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.
Generalized Multilevel Structural Equation Modeling
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew
2004-01-01
A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…
[Computer aided design and rapid manufacturing of removable partial denture frameworks].
Han, Jing; Lü, Pei-jun; Wang, Yong
2010-08-01
To introduce a method for the digital modeling and fabrication of removable partial denture (RPD) frameworks using self-developed RPD design software and a rapid manufacturing system. The three-dimensional data of two partially dentate dental casts were obtained using a three-dimensional cross-section scanner. A self-developed software package for RPD design was used to determine the path of insertion and to design the different components of the RPD frameworks. The components included the occlusal rest, clasp, lingual bar, polymeric retention framework and maxillary major connector. The design procedure for the components was as follows: first, determine the outline of the component; second, build the tissue surface of the component using the scanned data within the outline; third, use a preset cross section to produce the polished surface. Finally, the different RPD components were modeled and connected by minor connectors to form an integrated RPD framework. The finished data were imported into a self-developed selective laser melting (SLM) machine and metal frameworks were fabricated directly. RPD frameworks for the two scanned dental casts were modeled with this self-developed program and metal RPD frameworks were successfully fabricated using the SLM method. The finished metal frameworks fit well on the plaster models. The self-developed computer-aided design and computer-aided manufacturing (CAD-CAM) system for RPD design and fabrication has completely independent intellectual property rights and provides a new method of manufacturing metal RPD frameworks.
Calibration and Propagation of Uncertainty for Independence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Troy Michael; Kress, Joel David; Bhat, Kabekode Ghanasham
This document reports on progress and methods for the calibration and uncertainty quantification of the Independence model developed at UT Austin. The Independence model is an advanced thermodynamic and process model framework for piperazine solutions as a high-performance CO2 capture solvent. Progress is presented within the CCSI standard basic data model inference framework. Recent work has largely focused on the thermodynamic submodels of Independence.
An active monitoring method for flood events
NASA Astrophysics Data System (ADS)
Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya
2018-07-01
Timely, active detection and monitoring of a flood event are critical for a quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, and an active model for the concrete implementation of that framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.
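A minimal in-memory sketch of the publish-subscribe pattern behind the active warning component; the actual Sensor Event Service is a web service, so the broker class, threshold and gauge readings below are purely illustrative assumptions.

```python
class EventBroker:
    """Tiny in-memory stand-in for a publish-subscribe warning component."""
    def __init__(self):
        self.subscribers = []
    def subscribe(self, predicate, callback):
        self.subscribers.append((predicate, callback))
    def publish(self, observation):
        for predicate, callback in self.subscribers:
            if predicate(observation):
                callback(observation)

broker = EventBroker()
# Active warning: trigger the planning component when a gauge exceeds a threshold.
broker.subscribe(lambda obs: obs["water_level_m"] > 21.5,
                 lambda obs: print("flood warning, start planning:", obs))

for level in (20.8, 21.2, 21.9):                 # made-up gauge readings
    broker.publish({"station": "Liangzi-01", "water_level_m": level})
```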
Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank
2006-04-01
This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.
A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks
Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng
2009-01-01
Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review many existing models for many aspects of gene regulation; the pros and cons of each model are discussed. In addition, network inference algorithms are also surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions and the connections and differences among the algorithms are provided. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885
Generic framework for mining cellular automata models on protein-folding simulations.
Diaz, N; Tischer, I
2016-05-13
Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed using a methodology based on design patterns, which improves the experience of developing new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future uses for the new tool are outlined.
First-Order Frameworks for Managing Models in Engineering Optimization
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia M.; Lewis, Robert Michael
2000-01-01
Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of applicability of the first-order frameworks.
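A commonly used ingredient of first-order model management is an additive correction that forces a cheap model to match the expensive model's value and gradient at the current iterate. The sketch below illustrates only that consistency idea with toy functions; it is not the aerodynamic models or the specific AMMO algorithm of the paper.

```python
# First-order additive correction of a low-fidelity model (toy functions).
import numpy as np


def f_hi(x):                      # "high-fidelity" toy objective
    return np.sum(x**4) + np.sum(x**2)


def grad_hi(x):
    return 4 * x**3 + 2 * x


def f_lo(x):                      # cheaper, less accurate surrogate
    return 1.2 * np.sum(x**2)


def grad_lo(x):
    return 2.4 * x


def corrected_lo(x, x0):
    """Additive first-order correction of the low-fidelity model about x0."""
    value_shift = f_hi(x0) - f_lo(x0)
    grad_shift = grad_hi(x0) - grad_lo(x0)
    return f_lo(x) + value_shift + grad_shift @ (x - x0)


x0 = np.array([0.5, -0.3])
# Zeroth-order consistency: the corrected model matches f_hi exactly at x0.
assert np.isclose(corrected_lo(x0, x0), f_hi(x0))
# The corrected surrogate can now drive an inexpensive (e.g., trust-region)
# step before the next expensive high-fidelity evaluation.
step = np.array([0.05, 0.05])
print(corrected_lo(x0 + step, x0), f_hi(x0 + step))
```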
GeoFramework: A Modeling Framework for Solid Earth Geophysics
NASA Astrophysics Data System (ADS)
Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.
2003-12-01
As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and to make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework, recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally available in existing frameworks and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. The following codes are now being reengineered within the context of Pyre: Tecton, a 3-D FE visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic earthquake rupture; SNAC, a developing 3-D code based on the FLAC method for visco-elastoplastic deformation; SNARK, a 3-D FE-PIC method for viscoplastic deformation; and gPLATES, an open source paleogeographic/plate tectonics modeling package. We will demonstrate how codes can be linked with themselves, such as a regional and global model of mantle convection and a visco-elastoplastic representation of the crust within viscous mantle flow. Finally, we will describe how http://GeoFramework.org has become a distribution site for a suite of modeling software in geophysics.
Tsoi, B; O'Reilly, D; Jegathisawaran, J; Tarride, J-E; Blackhouse, G; Goeree, R
2015-06-17
In constructing or appraising a health economic model, an early consideration is whether the modelling approach selected is appropriate for the given decision problem. Frameworks and taxonomies that distinguish between modelling approaches can help make this decision more systematic, and this study aims to identify and compare the decision frameworks proposed to date in this topic area. A systematic review was conducted to identify frameworks from peer-reviewed and grey literature sources. The following databases were searched: OVID Medline and EMBASE; Wiley's Cochrane Library and Health Economic Evaluation Database; PubMed; and ProQuest. Eight decision frameworks were identified, each focused on a different set of modelling approaches and employing a different collection of selection criteria. The selection criteria can be categorized as either: (i) structural features (i.e. technical elements that are factual in nature) or (ii) practical considerations (i.e. context-dependent attributes). The most commonly mentioned structural features were population resolution (i.e. aggregate vs. individual) and interactivity (i.e. static vs. dynamic). Furthermore, understanding the needs of the end-users and stakeholders was frequently incorporated as a criterion within these frameworks. There is presently no universally-accepted framework for selecting an economic modelling approach. Rather, each highlights different criteria that may be of importance when determining whether a modelling approach is appropriate. Further discussion is thus necessary as the modelling approach selected will impact the validity of the underlying economic model and have downstream implications on its efficiency, transparency and relevance to decision-makers.
Impact of Compound Hydrate Dynamics on Phase Boundary Changes
NASA Astrophysics Data System (ADS)
Osegovic, J. P.; Max, M. D.
2006-12-01
Compound hydrate reactions are affected by the local concentration of hydrate forming materials (HFM). The relationship between HFM composition and the phase boundary is as significant as temperature and pressure. Selective uptake and sequestration of preferred hydrate formers (PF) has wide ranging implications for the state and potential use of natural hydrate formation, including impact on climate. Rising mineralizing fluids of hydrate formers (such as those that occur on Earth and are postulated to exist elsewhere in the solar system) will sequester PF before methane, resulting in a positive relationship between depth and BTU content as ethane and propane are removed before methane. In industrial settings, selective uptake of preferred formers can be used to separate gases. When depressurizing gas hydrate to release the stored gas, the hydrate's initial composition will set the decomposition phase boundary because the supporting solution takes on the composition of the hydrate phase. In other settings where hydrate is formed, transported, and then dissociated, similar effects can control the process. The behavior of compound hydrate systems can primarily fit into three categories: 1) In classically closed systems, all the material that can form hydrate is isolated, such as in a sealed laboratory vessel. In such systems, formation and decomposition are reversible processes with observed hysteresis related to mass or heat transfer limitations, or the order and magnitude in which individual hydrate forming gases are taken up from the mixture and subsequently released. 2) Kinetically closed systems are exposed to a solution mass flow across a hydrate mass. These systems can have multiple P-T phase boundaries based on the local conditions at each face of the hydrate mass. A portion of hydrate that is exposed to fresh mineralizing solution will contain more preferred hydrate formers than another portion that is exposed to a partially depleted solution. Examples of kinetically closed systems include pipeline blockages and natural hydrate concentrations associated with upwelling fluids in marine sediments. 3) In open systems, mass can either flow into or out of a system. In such situations compound hydrate will form or decompose to re-establish chemical equilibrium. This is accomplished by 1) loading/consuming a preferred hydrate former to/from the surroundings, 2) lowering/raising the temperature of the system, and 3) increasing the local pressure. Examples of this type of system include hydrate produced for low pressure transport, depressurized or superheated hydrate settings (pipeline remediation or energy recovery), or an industrial process where formation of compound hydrates may be used to separate and concentrate gases from a mixture. The relationship between composition and the phase boundary is as important as pressure and temperature effects. Composition is less significant for simple hydrates where the hydrate behaves as a one-component mineral, but for compound hydrate, feedback between pressure, temperature, and composition can result in complex system behavior.
Multicriteria framework for selecting a process modelling language
NASA Astrophysics Data System (ADS)
Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel
2016-01-01
The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and also due to the lack of guidelines on evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
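To make the combination of quality criteria and multicriteria aggregation concrete, the following sketch scores candidate languages with a simple weighted additive rule. The criteria names, weights, scores, and candidate languages are all invented for illustration; the paper combines SEQUAL with an MCDA method rather than this particular aggregation.

```python
# Illustrative weighted additive scoring over SEQUAL-style criteria.
criteria_weights = {"domain_appropriateness": 0.4,
                    "comprehensibility": 0.3,
                    "tool_support": 0.2,
                    "expressive_economy": 0.1}

# Scores on a 0-10 scale for three hypothetical candidate languages.
scores = {
    "BPMN":       {"domain_appropriateness": 8, "comprehensibility": 7,
                   "tool_support": 9, "expressive_economy": 6},
    "EPC":        {"domain_appropriateness": 7, "comprehensibility": 8,
                   "tool_support": 6, "expressive_economy": 7},
    "Petri nets": {"domain_appropriateness": 6, "comprehensibility": 5,
                   "tool_support": 5, "expressive_economy": 9},
}

ranking = sorted(
    ((sum(w * scores[lang][c] for c, w in criteria_weights.items()), lang)
     for lang in scores),
    reverse=True,
)
for total, lang in ranking:
    print(f"{lang}: {total:.2f}")
```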
State Event Models for the Formal Analysis of Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles
2014-01-01
The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "full control" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.
Physiome-model-based state-space framework for cardiac deformation recovery.
Wong, Ken C L; Zhang, Heye; Liu, Huafeng; Shi, Pengcheng
2007-11-01
To more reliably recover cardiac information from noise-corrupted, patient-specific measurements, it is essential to employ meaningful constraining models and adopt appropriate optimization criteria to couple the models with the measurements. Although biomechanical models have been extensively used for myocardial motion recovery with encouraging results, the passive nature of such constraints limits their ability to fully account for the deformation caused by active forces of the myocytes. To overcome such limitations, we propose to adopt a cardiac physiome model as the prior constraint for cardiac motion analysis. The cardiac physiome model comprises an electric wave propagation model, an electromechanical coupling model, and a biomechanical model, which are connected through cardiac system dynamics for a more complete description of the macroscopic cardiac physiology. Embedded within a multiframe state-space framework, the uncertainties of the model and the patient's measurements are systematically dealt with to arrive at optimal cardiac kinematic estimates and possibly beyond. Experiments have been conducted to compare our proposed cardiac-physiome-model-based framework with the solely biomechanical model-based framework. The results show that our proposed framework recovers more accurate cardiac deformation from synthetic data and obtains more sensible estimates from real magnetic resonance image sequences. With the active components introduced by the cardiac physiome model, cardiac deformations recovered from patients' medical images are more physiologically plausible.
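The state-space coupling of model and measurement uncertainties can be illustrated with a much simpler stand-in: a linear Kalman filter fusing a toy dynamics model with noisy position measurements. This is only a sketch of the estimation step; the actual framework couples electrical, electromechanical, and biomechanical models, and all numbers below are invented.

```python
# Linear-Gaussian state-space filtering sketch (toy dynamics and data).
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])   # toy kinematic dynamics (position, velocity)
H = np.array([[1.0, 0.0]])               # only position is measured
Q = 1e-3 * np.eye(2)                      # model (process) uncertainty
R = np.array([[0.05]])                    # measurement uncertainty

x = np.zeros(2)                           # state estimate
P = np.eye(2)                             # estimate covariance

rng = np.random.default_rng(0)
true_pos = np.cumsum(0.1 * np.ones(50))
measurements = true_pos + rng.normal(0.0, 0.2, size=50)

for z in measurements:
    # Predict with the constraining (dynamics) model.
    x = A @ x
    P = A @ P @ A.T + Q
    # Update with the patient-specific measurement.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("final position estimate:", x[0])
```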
Wu, Yiping; Liu, Shu-Guang
2012-01-01
R program language-Soil and Water Assessment Tool-Flexible Modeling Environment (R-SWAT-FME) (Wu and Liu, 2012) is a comprehensive modeling framework that adopts an R package, Flexible Modeling Environment (FME) (Soetaert and Petzoldt, 2010), for the Soil and Water Assessment Tool (SWAT) model (Arnold and others, 1998; Neitsch and others, 2005). This framework provides the functionalities of parameter identifiability, model calibration, and sensitivity and uncertainty analysis with instant visualization. This user's guide shows how to apply this framework for a customized SWAT project.
Modeling asset price processes based on mean-field framework
NASA Astrophysics Data System (ADS)
Ieda, Masashi; Shiino, Masatoshi
2011-12-01
We propose a model of the dynamics of financial assets based on the mean-field framework. This framework allows us to construct a model which includes the interaction among the financial assets, reflecting the market structure. Our study is at the cutting edge in the sense that it takes a microscopic approach to modeling the financial market. To demonstrate the effectiveness of our model concretely, we provide a case study: the pricing problem of the European call option with short-time memory noise.
A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds
NASA Astrophysics Data System (ADS)
Hagos, Samson; Feng, Zhe; Plant, Robert S.; Houze, Robert A.; Xiao, Heng
2018-02-01
A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. In addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
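The population-dynamics idea can be sketched as a toy discrete-time birth-death simulation: each cell grows or decays with some probability, new cells are initiated by the forcing, and each cell contributes a mass flux that depends nonlinearly on its area. The probabilities, the flux law, and all numbers below are invented for illustration and are not the calibrated quantities used in STOMP.

```python
# Toy birth-death simulation of a convective cell population (illustrative only).
import numpy as np

rng = np.random.default_rng(1)

p_grow, p_decay = 0.30, 0.28      # per-step growth/decay probabilities (assumed)
new_cells_per_step = 2            # externally forced initiation of small cells
alpha, beta = 0.5, 1.5            # mass flux ~ alpha * area**beta (nonlinear, assumed)

areas = list(rng.uniform(1.0, 2.0, size=10))   # initial cell areas (arbitrary units)
flux_series = []

for step in range(200):
    next_areas = []
    for a in areas:
        u = rng.random()
        if u < p_grow:
            a *= 1.2              # cell grows
        elif u < p_grow + p_decay:
            a *= 0.7              # cell decays
        if a > 0.5:               # cells below a minimum size dissipate
            next_areas.append(a)
    # Large-scale forcing keeps initiating new small cells.
    next_areas.extend(rng.uniform(0.8, 1.2, size=new_cells_per_step))
    areas = next_areas
    flux_series.append(sum(alpha * a**beta for a in areas))

print("mean cloud-base mass flux (arbitrary units):", np.mean(flux_series))
```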
A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services
NASA Astrophysics Data System (ADS)
Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.
2015-12-01
Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, thus making it easy for modelers to maintain and update the models. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) the model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing by uncovering models' metadata through BMI functions. After a BMI-enabled model is deployed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing the model interface using BMI as well as providing a set of utilities smoothing the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of the BMI-enabled web service models. By using the revised EMELI, an example will be presented of integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
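The self-describing interface idea can be sketched with a small BMI-style wrapper: the model exposes initialize/update/finalize plus metadata getters, so a coupling framework (or a web-service client) can discover variables and drive the model without knowing its internals. The toy "bucket" model and the reduced method set shown here are illustrative; they are not the full CSDMS BMI specification or the EMELI API.

```python
# BMI-style wrapper sketch around a toy linear-reservoir model.
class BmiBucketModel:
    def initialize(self, config: dict) -> None:
        self.storage = config.get("initial_storage_mm", 10.0)
        self.k = config.get("recession_coefficient", 0.1)
        self.runoff = 0.0
        self.time = 0.0

    def update(self) -> None:
        # One daily step of a linear-reservoir bucket.
        self.runoff = self.k * self.storage
        self.storage -= self.runoff
        self.time += 1.0

    def finalize(self) -> None:
        pass

    # Metadata getters make the model self-describing to the framework.
    def get_output_var_names(self):
        return ("runoff",)

    def get_var_units(self, name: str) -> str:
        return {"runoff": "mm d-1", "storage": "mm"}[name]

    def get_value(self, name: str) -> float:
        return getattr(self, name)


# A coupling framework or web-service client only needs the interface:
model = BmiBucketModel()
model.initialize({"initial_storage_mm": 25.0})
for _ in range(5):
    model.update()
    print(model.get_value("runoff"), model.get_var_units("runoff"))
model.finalize()
```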
USDA-ARS's Scientific Manuscript database
The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...
A framework for modeling uncertainty in regional climate change
In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...
A Framework for Dimensionality Assessment for Multidimensional Item Response Models
ERIC Educational Resources Information Center
Svetina, Dubravka; Levy, Roy
2014-01-01
A framework is introduced for considering dimensionality assessment procedures for multidimensional item response models. The framework characterizes procedures in terms of their confirmatory or exploratory approach, parametric or nonparametric assumptions, and applicability to dichotomous, polytomous, and missing data. Popular and emerging…
Stress distribution in Co-Cr implant frameworks after laser or TIG welding.
de Castro, Gabriela Cassaro; de Araújo, Cleudmar Amaral; Mesquita, Marcelo Ferraz; Consani, Rafael Leonardo Xediek; Nóbilo, Mauro Antônio de Arruda
2013-01-01
Lack of passivity has been associated with biomechanical problems in implant-supported prostheses. The aim of this study was to evaluate, by photoelastic analysis, the passivity of three techniques for fabricating an implant framework from a Co-Cr alloy. The model was obtained from a steel die simulating an edentulous mandible with 4 external hexagon analog implants with a standard platform. On this model, five frameworks were fabricated for each of three groups: a monoblock framework (control), and laser-welded and TIG-welded frameworks. The photoelastic model was made from a flexible epoxy resin. For the photoelastic analysis, the frameworks were bolted onto the model for verification of the maximum shear stress at 34 selected points around the implants and 5 points in the middle of the model. The stresses were compared over the whole photoelastic model, between the right, left, and center regions, and between the cervical and apical regions. The values were subjected to two-way ANOVA and Tukey's test (α=0.05). There was no significant difference among the groups and studied areas (p>0.05). It was concluded that the stresses generated around the implants were similar for all techniques.
Model-theoretic framework for sensor data fusion
NASA Astrophysics Data System (ADS)
Zavoleas, Kyriakos P.; Kokar, Mieczyslaw M.
1993-09-01
The main goal of our research in sensor data fusion (SDF) is the development of a systematic approach (a methodology) to designing systems for interpreting sensory information and for reasoning about the situation based upon this information and upon available data bases and knowledge bases. To achieve such a goal, two kinds of subgoals have been set: (1) develop a theoretical framework in which rational design/implementation decisions can be made, and (2) design a prototype SDF system along the lines of the framework. Our initial design of the framework has been described in our previous papers. In this paper we concentrate on the model-theoretic aspects of this framework. We postulate that data are embedded in data models, and information processing mechanisms are embedded in model operators. The paper is devoted to analyzing the classes of model operators and their significance in SDF. We investigate transformation, abstraction, and fusion operators. A prototype SDF system, fusing data from range and intensity sensors, is presented, exemplifying the structures introduced. Our framework is justified by the fact that it provides modularity, traceability of information flow, and a basis for a specification language for SDF.
Clinical time series prediction: Toward a hierarchical dynamical system framework.
Liu, Zitao; Hauskrecht, Milos
2015-09-01
Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effects of various patient management interventions, and for clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
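The two ingredients the framework combines can be illustrated in a much simpler, numpy-only form: Gaussian-process regression puts an irregularly sampled lab series onto a regular grid, and a simple linear dynamical model (here AR(1)) is fitted to the regularized series for prediction. This is not the authors' hierarchical model; the data, kernel settings, and noise level are invented for the example.

```python
# Simplified two-stage sketch: GP interpolation + AR(1) prediction.
import numpy as np


def rbf_kernel(a, b, length=3.0, variance=1.0):
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)


rng = np.random.default_rng(2)
t_obs = np.sort(rng.uniform(0, 30, size=12))        # irregular sample times (days)
y_obs = 10 + 2 * np.sin(t_obs / 5.0) + rng.normal(0, 0.2, size=12)

# GP posterior mean on a regular daily grid.
t_grid = np.arange(0, 31, 1.0)
K = rbf_kernel(t_obs, t_obs) + 0.04 * np.eye(len(t_obs))   # noise variance 0.2**2
K_star = rbf_kernel(t_grid, t_obs)
y_grid = K_star @ np.linalg.solve(K, y_obs - y_obs.mean()) + y_obs.mean()

# Fit AR(1): y[t+1] ~ a * y[t] + b, then predict one step ahead.
X = np.column_stack([y_grid[:-1], np.ones(len(y_grid) - 1)])
a, b = np.linalg.lstsq(X, y_grid[1:], rcond=None)[0]
print("one-step-ahead prediction:", a * y_grid[-1] + b)
```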
ERIC Educational Resources Information Center
King, Gillian; Currie, Melissa; Smith, Linda; Servais, Michelle; McDougall, Janette
2008-01-01
A framework of operating models for interdisciplinary research programs in clinical service organizations is presented, consisting of a "clinician-researcher" skill development model, a program evaluation model, a researcher-led knowledge generation model, and a knowledge conduit model. Together, these models comprise a tailored, collaborative…
A general modeling framework for describing spatially structured population dynamics
Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan
2017-01-01
Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population are modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. Working within a common framework also reduces the chance that comparative analyses are colored by model details rather than general principles.
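A minimal version of the network idea can be written in a few lines: nodes hold local abundances, directed weighted edges move a fraction of each node's population per discrete time step, and local growth is applied at each node. The three-node topology, movement fractions, growth rates, and carrying capacities below are invented for illustration; they are not taken from the paper or its supplied code.

```python
# Toy network-based population update (illustrative parameters only).
import numpy as np

# Movement matrix M[i, j] = fraction of node i's population moving to node j per step.
M = np.array([[0.00, 0.20, 0.05],
              [0.10, 0.00, 0.10],
              [0.00, 0.30, 0.00]])
stay = 1.0 - M.sum(axis=1)            # fraction remaining at each node
r = np.array([0.05, -0.02, 0.10])     # intrinsic per-step growth rates
K = np.array([500.0, 300.0, 800.0])   # carrying capacities

n = np.array([100.0, 50.0, 20.0])     # initial abundances
for t in range(50):
    # Movement along directed edges, then density-dependent growth at each node.
    n = stay * n + M.T @ n
    n = n + r * n * (1.0 - n / K)

print("abundances after 50 steps:", np.round(n, 1))
```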
Entity-Centric Abstraction and Modeling Framework for Transportation Architectures
NASA Technical Reports Server (NTRS)
Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.
2007-01-01
A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework using an entity-centric abstraction of transportation is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field which permits analysis of future concepts under a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.
Clinical Knowledge Governance Framework for Nationwide Data Infrastructure Projects.
Wulff, Antje; Haarbrandt, Birger; Marschollek, Michael
2018-01-01
The availability of semantically-enriched and interoperable clinical information models is crucial for reusing once collected data across institutions like aspired in the German HiGHmed project. Funded by the Federal Ministry of Education and Research, this nationwide data infrastructure project adopts the openEHR approach for semantic modelling. Here, strong governance is required to define high-quality and reusable models. Design of a clinical knowledge governance framework for openEHR modelling in cross-institutional settings like HiGHmed. Analysis of successful practices from international projects, published ideas on archetype governance and own modelling experiences as well as modelling of BPMN processes. We designed a framework by presenting archetype variations, roles and responsibilities, IT support and modelling workflows. Our framework has great potential to make the openEHR modelling efforts manageable. Because practical experiences are rare, prospectively our work will be predestinated to evaluate the benefits of such structured governance approaches.
A new fit-for-purpose model testing framework: Decision Crash Tests
NASA Astrophysics Data System (ADS)
Tolson, Bryan; Craig, James
2016-04-01
Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have identified that a good standard framework for model testing called the Klemeš Crash Tests (KCTs), which are the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) rename as KCTs, has yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions, and ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration, etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or do not upgrade the existing flood control structure) under two different sets of model building decisions. In one case, we show the set of model building decisions has a low probability of correctly supporting the upgrade decision. In the other case, we show evidence suggesting another set of model building decisions has a high probability of correctly supporting the decision. The proposed DCT framework focuses on what model users typically care about: the management decision in question. The DCT framework will often be very strict and will produce easy-to-interpret results enabling clear unsuitability determinations. In the past, hydrologic modelling progress has necessarily meant new models and model building methods. Continued progress in hydrologic modelling requires finding clear evidence to motivate researchers to disregard unproductive models and methods, and the DCT framework is built to produce this kind of evidence. References: Andréassian, V., C. Perrin, L. Berthet, N. Le Moine, J. Lerat, C. Loumagne, L. Oudin, T. Mathevet, M.-H. Ramos, and A. Valéry (2009), Crash tests for a standardized evaluation of hydrological models. Hydrology and Earth System Sciences, 13, 1757-1764. Klemeš, V. (1986), Operational testing of hydrological simulation models. Hydrological Sciences Journal, 31 (1), 13-24.
Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)
NASA Technical Reports Server (NTRS)
Gray, Justin S.; Briggs, Jeffery L.
2008-01-01
The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
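The separation of the execution process from the model can be sketched as a minimal interface plus a reusable process written only against that interface. The interface, class names, and toy model below are illustrative assumptions; they are not the actual ROSE API.

```python
# Sketch: a reusable "process" (parameter study) applied to any model
# implementing a minimal run() interface.
from typing import Dict, Protocol


class Model(Protocol):
    def run(self, inputs: Dict[str, float]) -> Dict[str, float]:
        ...


class NozzleAreaModel:
    """A toy model: thrust proportional to throat area times chamber pressure."""

    def run(self, inputs: Dict[str, float]) -> Dict[str, float]:
        thrust = 1.4 * inputs["area_m2"] * inputs["pressure_pa"]
        return {"thrust_n": thrust}


def parameter_study(model: Model, base: Dict[str, float],
                    name: str, values) -> Dict[float, Dict[str, float]]:
    """A reusable process that sweeps one input of any model."""
    results = {}
    for v in values:
        case = dict(base)
        case[name] = v
        results[v] = model.run(case)
    return results


sweep = parameter_study(NozzleAreaModel(),
                        base={"area_m2": 0.01, "pressure_pa": 2.0e6},
                        name="area_m2",
                        values=[0.005, 0.010, 0.020])
for area, out in sweep.items():
    print(area, out["thrust_n"])
```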
Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; ...
2016-04-25
Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.
Enhancing a socio-hydrological modelling framework through field observations: a case study in India
NASA Astrophysics Data System (ADS)
den Besten, Nadja; Pande, Saket; Savenije, Huub H. G.
2016-04-01
Recently, a smallholder socio-hydrological modelling framework was proposed and deployed to understand the underlying dynamics of the Agrarian Crisis in Maharashtra state of India. It was found that cotton and sugarcane smallholders who lack irrigation and storage techniques are most susceptible to distress. This study further expands the application of the modelling framework to other crops that are abundant in the state of Maharashtra, such as Paddy, Jowar and Soyabean, to assess whether the conclusions on the possible causes behind smallholder distress still hold. Further, fieldwork will be undertaken in March 2016 in the district of Pune. During the fieldwork, 50 smallholders will be interviewed, and the socio-hydrological assumptions on the hydrology and capital equations and the corresponding closure relationships incorporated in the current model will be put to the test. Besides the assumptions, the questionnaires will be used to better understand the hydrological reality of the smallholders in terms of water usage and storage capacity. In combination with historical records of the smallholders' socio-economic data, acquired over the last thirty years and available through several NGOs in the region, the socio-hydrological realism of the modelling framework will be enhanced. The preliminary outcomes of a desktop study show the possibilities of a water-centric modelling framework in understanding the constraints on smallholder farming. The results and methods described can be a first step guiding subsequent research on the modelling framework: a start in testing the framework in multiple rural locations around the globe.
A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies
NASA Astrophysics Data System (ADS)
Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.
2018-06-01
We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition), while providing a consistent means for including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw upon an analogy to radiation propagation through a turbid medium to develop the model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.
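A hedged numerical sketch of the turbid-medium analogy: assume the encounter probability along a trajectory segment has a Beer's-law-type form, p = 1 - exp(-G * a * ds), with a the leaf area density, ds the segment path length, and G a projection factor, and thin deposition by an efficiency factor. The constants, the velocity model, and the exact form of p are assumptions for illustration; the paper develops the consistent formulation and treats deposition efficiency through a separate sub-model.

```python
# Toy Monte Carlo of potential deposition with a Beer's-law-type encounter probability.
import numpy as np

rng = np.random.default_rng(3)

G = 0.5                 # projection factor (assumed)
a = 2.0                 # leaf area density, m^2 m^-3 (assumed)
efficiency = 0.8        # fraction of encounters that actually deposit (assumed)
dt = 0.05               # time step, s
n_particles, n_steps = 2000, 200

deposited = 0
for _ in range(n_particles):
    for _ in range(n_steps):
        # Toy speed: mean drift plus turbulent fluctuation.
        speed = abs(rng.normal(0.5, 0.3))        # m s^-1
        ds = speed * dt                           # path length in this step
        p_encounter = 1.0 - np.exp(-G * a * ds)
        if rng.random() < p_encounter * efficiency:
            deposited += 1
            break

print("deposited fraction:", deposited / n_particles)
```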
Models and frameworks: a synergistic association for developing component-based applications.
Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.
A smoothed particle hydrodynamics framework for modelling multiphase interactions at meso-scale
NASA Astrophysics Data System (ADS)
Li, Ling; Shen, Luming; Nguyen, Giang D.; El-Zein, Abbas; Maggi, Federico
2018-01-01
A smoothed particle hydrodynamics (SPH) framework is developed for modelling multiphase interactions at meso-scale, including the liquid-solid interaction induced deformation of the solid phase. With an inter-particle force formulation that mimics the inter-atomic force in molecular dynamics, the proposed framework includes the long-range attractions between particles, and more importantly, the short-range repulsive forces to avoid particle clustering and instability problems. Three-dimensional numerical studies have been conducted to demonstrate the capabilities of the proposed framework to quantitatively replicate the surface tension of water, to model the interactions between immiscible liquids and solid, and more importantly, to simultaneously model the deformation of solid and liquid induced by the multiphase interaction. By varying inter-particle potential magnitude, the proposed SPH framework has successfully simulated various wetting properties ranging from hydrophobic to hydrophilic surfaces. The simulation results demonstrate the potential of the proposed framework to genuinely study complex multiphase interactions in wet granular media.
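The pairwise force idea described above can be sketched as a Lennard-Jones-like law that is strongly repulsive below an equilibrium spacing and weakly attractive beyond it, truncated at a cutoff comparable to the smoothing length. The exponents, magnitudes, and cutoff are illustrative assumptions, not the calibrated values used to reproduce the surface tension of water in the paper.

```python
# Sketch of an inter-particle force with short-range repulsion and
# long-range attraction (illustrative parameters only).
def pair_force(r, d0=1.0, A=0.02, p=4, q=2, cutoff=3.0):
    """Force magnitude between two particles separated by distance r.

    Positive = repulsion, negative = attraction; zero at r = d0 and beyond
    the cutoff.
    """
    if r <= 0.0 or r >= cutoff:
        return 0.0
    return A * ((d0 / r) ** p - (d0 / r) ** q)


for r in (0.8, 1.0, 1.5, 2.5, 3.5):
    print(f"r = {r:.1f}  F = {pair_force(r): .5f}")
# Small r  -> large positive force (short-range repulsion avoids clustering)
# r = d0   -> zero force (equilibrium spacing)
# r > d0   -> small negative force (long-range attraction, surface-tension-like)
```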
Sequentially Executed Model Evaluation Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-20
Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.
Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey
2014-04-15
In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.
An ontology for component-based models of water resource systems
NASA Astrophysics Data System (ADS)
Elag, Mostafa; Goodall, Jonathan L.
2013-08-01
Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.
lazar: a modular predictive toxicology framework
Maunz, Andreas; Gütlein, Martin; Rautenberg, Micha; Vorgrimmler, David; Gebele, Denis; Helma, Christoph
2013-01-01
lazar (lazy structure–activity relationships) is a modular framework for predictive toxicology. Similar to the read across procedure in toxicological risk assessment, lazar creates local QSAR (quantitative structure–activity relationship) models for each compound to be predicted. Model developers can choose between a large variety of algorithms for descriptor calculation and selection, chemical similarity indices, and model building. This paper presents a high level description of the lazar framework and discusses the performance of example classification and regression models. PMID:23761761
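A schematic "lazy" local model in the spirit of read-across: for each query compound, select the most similar training compounds (Tanimoto similarity over binary fragment descriptors) and make a similarity-weighted prediction from their activities. The descriptors, data, and thresholds are invented for the example; real lazar models offer many descriptor, similarity, and learning algorithms.

```python
# Toy lazy local QSAR prediction via similarity-weighted neighbours.
def tanimoto(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0


# Training set: fragment-key sets and a binary activity label (invented).
training = [
    ({"C=O", "c1ccccc1", "N"}, 1),
    ({"C=O", "O", "Cl"}, 0),
    ({"c1ccccc1", "N", "S"}, 1),
    ({"O", "Cl", "Br"}, 0),
]


def predict(query: set, k: int = 3, min_sim: float = 0.1) -> float:
    neighbours = sorted(
        ((tanimoto(query, frags), label) for frags, label in training),
        reverse=True,
    )[:k]
    neighbours = [(s, y) for s, y in neighbours if s >= min_sim]
    if not neighbours:
        return float("nan")            # outside the applicability domain
    total = sum(s for s, _ in neighbours)
    return sum(s * y for s, y in neighbours) / total


print(predict({"C=O", "c1ccccc1"}))    # probability-like score for "active"
```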
Control of Distributed Parameter Systems
1990-08-01
...variant of the general Lotka-Volterra model for interspecific competition. The variant described the emergence of one subpopulation from another as a... A unified approximation framework for parameter estimation in general linear PDE models has been completed. This framework has provided the theoretical basis for a number of...
Multi-Fidelity Framework for Modeling Combustion Instability
2016-07-27
...generated from the reduced-domain dataset. Evaluations of the framework are performed based on simplified test problems for a model rocket combustor...
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
A Framework for Cloudy Model Optimization and Database Storage
NASA Astrophysics Data System (ADS)
Calvén, Emilia; Helton, Andrew; Sankrit, Ravi
2018-01-01
We present a framework for producing Cloudy photoionization models of the nebular emission from novae ejecta and storing a subset of the results in SQL database format for later use. The database can be searched for models best fitting observed spectral line ratios. Additionally, the framework includes an optimization feature that can be used in tandem with the database to search for and improve on models by creating new Cloudy models while varying the parameters. The database search and optimization can be used to explore the structures of nebulae by deriving their properties from the best-fit models. The goal is to provide the community with a large database of Cloudy photoionization models, generated from parameters reflecting conditions within novae ejecta, that can be easily fitted to observed spectral lines, either by directly accessing the database using the framework code or through a website specifically made for this purpose.
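A toy version of the storage-and-search idea: model parameters and predicted line ratios are kept in an SQLite table, and the best-fitting model for a set of observed ratios is found with a simple chi-square-like distance. The schema, line choices, and numbers are illustrative assumptions, not the actual database layout of the framework.

```python
# Store toy photoionization model results and query for the best fit.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE models (
                   id INTEGER PRIMARY KEY,
                   hden REAL, temp REAL,       -- input parameters
                   oiii_hb REAL, nii_ha REAL   -- predicted line ratios
               )""")
con.executemany(
    "INSERT INTO models (hden, temp, oiii_hb, nii_ha) VALUES (?, ?, ?, ?)",
    [(1e4, 1.0e4, 3.2, 0.45),
     (1e5, 1.2e4, 5.1, 0.30),
     (1e6, 1.5e4, 8.4, 0.12)],
)

observed = {"oiii_hb": 5.0, "nii_ha": 0.28}

rows = con.execute("SELECT id, hden, temp, oiii_hb, nii_ha FROM models").fetchall()
best = min(
    rows,
    key=lambda r: ((r[3] - observed["oiii_hb"]) / observed["oiii_hb"]) ** 2
                  + ((r[4] - observed["nii_ha"]) / observed["nii_ha"]) ** 2,
)
print("best-fitting model id:", best[0], "hden:", best[1], "temp:", best[2])
```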
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les
1991-01-01
The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.
Translation from UML to Markov Model: A Performance Modeling Framework
NASA Astrophysics Data System (ADS)
Khan, Razib Hayat; Heegaard, Poul E.
Performance engineering focuses on the quantitative investigation of the behavior of a system during the early phases of the system development life cycle. Bearing this in mind, we delineate a performance modeling framework for communication-system applications that proposes a translation process from high-level UML notation to a Continuous Time Markov Chain (CTMC) model and solves the model for relevant performance metrics. The framework utilizes UML collaborations, activity diagrams, and deployment diagrams to generate the performance model for a communication system. The system dynamics are captured by UML collaboration and activity diagrams as reusable specification building blocks, while the deployment diagram highlights the components of the system. The collaboration and activity diagrams show how reusable building blocks, in the form of collaborations, compose the service components through input and output pins by highlighting the behavior of the components; a mapping between the collaborations and the system components identified by the deployment diagram is then delineated. Moreover, the UML models are annotated with performance-related quality of service (QoS) information, which is necessary for solving the performance model for relevant performance metrics through our proposed framework. The applicability of our proposed performance modeling framework to performance evaluation is demonstrated in the context of modeling a communication system.
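As an illustration of the final step described above, the sketch below solves a small CTMC for its steady-state distribution with numpy and derives one performance metric from it. The three states, their rates, and the "utilization" interpretation are hypothetical, not taken from the paper.

```python
import numpy as np

# Generator (rate) matrix Q of a small 3-state CTMC: rows sum to zero.
# States could represent e.g. idle / processing / blocked in a service component.
Q = np.array([
    [-2.0,  2.0,  0.0],
    [ 1.0, -3.0,  2.0],
    [ 0.0,  4.0, -4.0],
])

def ctmc_steady_state(Q):
    """Solve pi @ Q = 0 with sum(pi) = 1 for the stationary distribution."""
    n = Q.shape[0]
    # Replace one balance equation with the normalization constraint.
    A = np.vstack([Q.T[:-1, :], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

pi = ctmc_steady_state(Q)
print("steady-state probabilities:", pi)
# Example performance metric: utilization = probability of not being idle.
print("utilization:", 1.0 - pi[0])
```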
Bayesian calibration for electrochemical thermal model of lithium-ion cells
NASA Astrophysics Data System (ADS)
Tagade, Piyush; Hariharan, Krishnan S.; Basu, Suman; Verma, Mohan Kumar Singh; Kolake, Subramanya Mayya; Song, Taewon; Oh, Dukjin; Yeo, Taejung; Doo, Seokgwang
2016-07-01
The pseudo-two-dimensional electrochemical thermal (P2D-ECT) model contains many parameters that are difficult to evaluate experimentally. Estimation of these model parameters is challenging due to the computational cost and the transient nature of the model. Due to a lack of complete physical understanding, this issue is aggravated at extreme conditions such as low temperature (LT) operation. This paper presents a Bayesian calibration framework for estimation of the P2D-ECT model parameters. The framework uses a matrix variate Gaussian process representation to obtain a computationally tractable formulation for calibration of the transient model. Performance of the framework is investigated for calibration of the P2D-ECT model across a range of temperatures (333 K to 263 K) and operating protocols. In the absence of complete physical understanding, the framework also quantifies structural uncertainty in the calibrated model. This information is used by the framework to test the validity of new physical phenomena before their incorporation in the model. This capability is demonstrated by introducing temperature dependence of Bruggeman's coefficient and lithium plating formation at LT. With the incorporation of the new physics, the calibrated P2D-ECT model accurately predicts the cell voltage with high confidence. The accurate predictions are used to obtain new insights into low temperature lithium ion cell behavior.
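To make the basic calibration step concrete, here is a much simpler, hedged sketch: a grid-based Bayesian posterior over one hypothetical parameter given a simulated versus "measured" voltage curve with Gaussian noise. The stand-in simulator, the parameter, and all numerical values are invented for illustration; the paper's matrix-variate Gaussian process machinery is not reproduced.

```python
import numpy as np

# Hypothetical "measured" discharge voltage curve (V) at a few time points.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 3600.0, 50)                     # s
v_obs = 3.7 - 0.4 * (t / 3600.0) + rng.normal(0, 0.01, t.size)

def cell_model(theta, t):
    """Stand-in for the expensive P2D-ECT simulator: a linear voltage fade
    whose slope is controlled by a single calibration parameter theta."""
    return 3.7 - theta * (t / 3600.0)

# Grid-based Bayesian calibration of theta with a uniform prior
# and an i.i.d. Gaussian measurement-noise likelihood (sigma assumed known).
sigma = 0.01
grid = np.linspace(0.1, 0.8, 400)
log_post = np.array([
    -0.5 * np.sum((v_obs - cell_model(th, t)) ** 2) / sigma ** 2
    for th in grid
])
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, grid)

mean = np.trapz(grid * post, grid)
print(f"posterior mean of theta ~ {mean:.3f}")
```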
NoSQL Based 3D City Model Management System
NASA Astrophysics Data System (ADS)
Mao, B.; Harrie, L.; Cao, J.; Wu, Z.; Shen, J.
2014-04-01
To manage increasingly complicated 3D city models, a framework based on NoSQL database is proposed in this paper. The framework supports import and export of 3D city model according to international standards such as CityGML, KML/COLLADA and X3D. We also suggest and implement 3D model analysis and visualization in the framework. For city model analysis, 3D geometry data and semantic information (such as name, height, area, price and so on) are stored and processed separately. We use a Map-Reduce method to deal with the 3D geometry data since it is more complex, while the semantic analysis is mainly based on database query operation. For visualization, a multiple 3D city representation structure CityTree is implemented within the framework to support dynamic LODs based on user viewpoint. Also, the proposed framework is easily extensible and supports geoindexes to speed up the querying. Our experimental results show that the proposed 3D city management system can efficiently fulfil the analysis and visualization requirements.
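The separation described above, cheap attribute queries on semantic data versus map-reduce style processing of geometry, can be illustrated with the plain-Python sketch below. The building records, attribute names, and queries are hypothetical; this is not the paper's NoSQL implementation.

```python
from functools import reduce

# Hypothetical city-model records: semantics and (simplified) geometry kept side by side.
buildings = [
    {"id": "b1", "district": "old town", "height": 22.0,
     "footprint": [(0, 0), (10, 0), (10, 8), (0, 8)]},
    {"id": "b2", "district": "old town", "height": 15.0,
     "footprint": [(0, 0), (6, 0), (6, 6), (0, 6)]},
    {"id": "b3", "district": "harbour",  "height": 40.0,
     "footprint": [(0, 0), (20, 0), (20, 12), (0, 12)]},
]

def polygon_area(pts):
    """Shoelace formula for a simple polygon."""
    return 0.5 * abs(sum(x1 * y2 - x2 * y1
                         for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1])))

# "Semantic" analysis: cheap attribute filtering, as done with a database query.
tall = [b["id"] for b in buildings if b["height"] > 20.0]

# "Geometry" analysis in map-reduce style: map each building to
# (district, footprint area), then reduce by summing per district.
mapped = [(b["district"], polygon_area(b["footprint"])) for b in buildings]
def reducer(acc, kv):
    key, value = kv
    acc[key] = acc.get(key, 0.0) + value
    return acc
area_per_district = reduce(reducer, mapped, {})

print("buildings taller than 20 m:", tall)
print("footprint area per district:", area_per_district)
```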
A Regularized Volumetric Fusion Framework for Large-Scale 3D Reconstruction
NASA Astrophysics Data System (ADS)
Rajput, Asif; Funk, Eugen; Börner, Anko; Hellwich, Olaf
2018-07-01
Modern computational resources combined with low-cost depth sensing systems have enabled mobile robots to reconstruct 3D models of surrounding environments in real time. Unfortunately, low-cost depth sensors are prone to producing undesirable estimation noise in depth measurements, which either produces depth outliers or introduces surface deformations in the reconstructed model. Conventional 3D fusion frameworks integrate multiple error-prone depth measurements over time to reduce noise effects; therefore, additional constraints such as steady sensor movement and high frame rates are required for high quality 3D models. In this paper we propose a generic 3D fusion framework with a controlled regularization parameter which inherently reduces noise at the time of data fusion. This allows the proposed framework to generate high quality 3D models without enforcing additional constraints. Evaluation of the reconstructed 3D models shows that the proposed framework outperforms state-of-the-art techniques in terms of both absolute reconstruction error and processing time.
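Conventional volumetric fusion of the kind referenced above keeps, per voxel, a running weighted average of truncated signed distances. The sketch below shows that update in 1-D, followed by a toy Laplacian smoothing pass standing in for a regularization step; the regularizer, grid, noise levels, and boundary handling are all illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def fuse(tsdf, weight, new_tsdf, new_weight=1.0):
    """Standard running weighted average used in volumetric TSDF fusion."""
    fused = (weight * tsdf + new_weight * new_tsdf) / (weight + new_weight)
    return fused, weight + new_weight

def regularize(tsdf, lam=0.3, iters=10):
    """Toy stand-in for a regularization step: damped Laplacian smoothing
    that pulls each voxel toward the mean of its neighbours (circular boundary)."""
    for _ in range(iters):
        lap = np.roll(tsdf, 1) + np.roll(tsdf, -1) - 2.0 * tsdf
        tsdf = tsdf + lam * 0.5 * lap
    return tsdf

# 1-D "volume" whose true surface sits at x = 0 (truncated signed distance ramp).
x = np.linspace(-0.2, 0.2, 81)
truth = np.clip(x, -0.1, 0.1)

tsdf = np.zeros_like(x)
weight = np.zeros_like(x) + 1e-6
rng = np.random.default_rng(0)
for _ in range(20):                              # 20 noisy depth integrations
    noisy = np.clip(x + rng.normal(0, 0.02, x.size), -0.1, 0.1)
    tsdf, weight = fuse(tsdf, weight, noisy)

smoothed = regularize(tsdf)
print("rms error, plain fusion :", np.sqrt(np.mean((tsdf - truth) ** 2)))
print("rms error, regularized  :", np.sqrt(np.mean((smoothed - truth) ** 2)))
```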
Boehm, Udo; Steingroever, Helen; Wagenmakers, Eric-Jan
2018-06-01
Quantitative models that represent different cognitive variables in terms of model parameters are an important tool in the advancement of cognitive science. To evaluate such models, their parameters are typically tested for relationships with behavioral and physiological variables that are thought to reflect specific cognitive processes. However, many models do not come equipped with the statistical framework needed to relate model parameters to covariates. Instead, researchers often resort to classifying participants into groups depending on their values on the covariates, and subsequently comparing the estimated model parameters between these groups. Here we develop a comprehensive solution to the covariate problem in the form of a Bayesian regression framework. Our framework can be easily added to existing cognitive models and allows researchers to quantify the evidential support for relationships between covariates and model parameters using Bayes factors. Moreover, we present a simulation study that demonstrates the superiority of the Bayesian regression framework over the conventional classification-based approach.
NASA Astrophysics Data System (ADS)
Smith, B.
2015-12-01
In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME) with the goal to speed Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at Oak Ridge, Argonne, and Lawrence Berkeley Leadership Compute Facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers can now generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command line scripts and programs, and web browsers. The framework is designed to be scalable to large datasets, yet easy to use and familiar to scientists using previous tools. Integration in the ACME overall user interface facilitates data publication, further analysis, and quick feedback to model developers and scientists making component or coupled model runs.
Field Markup Language: biological field representation in XML.
Chang, David; Lovell, Nigel H; Dokos, Socrates
2007-01-01
With an ever increasing number of biological models available on the internet, a standardized modeling framework is required to allow information to be accessed or visualized. Based on the Physiome Modeling Framework, the Field Markup Language (FML) is being developed to describe and exchange field information for biological models. In this paper, we describe the basic features of FML, its supporting application framework and its ability to incorporate CellML models to construct tissue-scale biological models. As a typical application example, we present a spatially-heterogeneous cardiac pacemaker model which utilizes both FML and CellML to describe and solve the underlying equations of electrical activation and propagation.
Tyler Jon Smith; Lucy Amanda Marshall
2010-01-01
Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...
Digging into the corona: A modeling framework trained with Sun-grazing comet observations
NASA Astrophysics Data System (ADS)
Jia, Y. D.; Pesnell, W. D.; Bryans, P.; Downs, C.; Liu, W.; Schwartz, S. J.
2017-12-01
Images of comets diving into the low corona have been captured a few times in the past decade. Structures visible at various wavelengths during these encounters indicate a strong variation of the ambient conditions of the corona. We combine three numerical models: a global coronal model, a particle transportation model, and a cometary plasma interaction model into one framework to model the interaction of such Sun-grazing comets with plasma in the low corona. In our framework, cometary vapors are ionized via multiple channels and then captured by the coronal magnetic field. In seconds, these ions are further ionized into their highest charge state, which is revealed by certain coronal emission lines. Constrained by observations, we apply our framework to trace back to the local conditions of the ambient corona, and their spatial/time variation over a broad range of scales. Once trained by multiple stages of the comet's journey in the low corona, we illustrate how this framework can leverage these unique observations to probe the structure of the solar corona and solar wind.
Collaborated Architecture Framework for Composition of UML 2.0 in the Zachman Framework
NASA Astrophysics Data System (ADS)
Hermawan; Hastarista, Fika
2016-01-01
The Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed to collaborate ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model Driven Architecture (MDA) artifacts derived from various UML models and Software Requirement Specification (SRS) documents. This modeling is used to develop an Enterprise Resource Planning (ERP) system. Because ERP covers a large number of applications with complex relations, it is necessary to use the Agile Model Driven Design (AMDD) approach as an advanced method to transform the MDA into application-module components efficiently and accurately. Finally, use of the CAF fulfilled the needs of all stakeholders involved in the overall stages of the Rational Unified Process (RUP), and achieved high satisfaction with the functional features of the ERP software at PT. Iglas (Persero) Gresik.
Automatic Earth observation data service based on reusable geo-processing workflow
NASA Astrophysics Data System (ADS)
Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min
2008-12-01
A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service is used to generate the concrete BPEL, and a BPEL execution engine is adopted. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin
2017-01-01
Biological pattern formation exhibits a variety of striking phenomena. Mathematical modeling with reaction-diffusion partial differential equation systems is employed to study the mechanisms of pattern formation; however, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically, based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized and image features are extracted as the system feedback. The unknown model parameters are then obtained by comparing the image features of the simulated image with those of the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to pattern formation simulations for vascular mesenchymal cells and for lung development. Within the framework, the spot, stripe, and labyrinthine patterns of vascular mesenchymal cells, as well as the normal branching pattern and a branching pattern lacking side branches for lung development, are obtained in a finite number of iterations. The simulation results indicate that the simulation targets are readily achieved, especially when the simulated patterns are sensitive to the model parameters. Moreover, the simulation framework can be extended to other types of biological pattern formation. PMID:28225811
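The feedback loop described above (simulate, extract an image feature, compare with the target, adjust a parameter) can be sketched with a generic Gray-Scott reaction-diffusion model and a crude scalar image feature. The model, the "spot fraction" feature, and the proportional feedback gain are illustrative assumptions, not the authors' models or features.

```python
import numpy as np

def gray_scott(F, k, steps=4000, n=64, Du=0.16, Dv=0.08, dt=1.0):
    """Integrate a Gray-Scott reaction-diffusion system on an n x n periodic grid."""
    rng = np.random.default_rng(1)
    U = np.ones((n, n))
    V = np.zeros((n, n))
    # Seed a perturbed square in the centre so patterns can nucleate.
    s = slice(n // 2 - 5, n // 2 + 5)
    U[s, s], V[s, s] = 0.50, 0.25
    V += 0.01 * rng.random((n, n))
    lap = lambda A: (np.roll(A, 1, 0) + np.roll(A, -1, 0) +
                     np.roll(A, 1, 1) + np.roll(A, -1, 1) - 4.0 * A)
    for _ in range(steps):
        uvv = U * V * V
        U += dt * (Du * lap(U) - uvv + F * (1.0 - U))
        V += dt * (Dv * lap(V) + uvv - (F + k) * V)
    return V

def image_feature(V):
    """Very simple 'image feature': fraction of pixels belonging to spots."""
    return float(np.mean(V > 0.2))

# Visual-feedback loop: adjust the feed rate F until the simulated
# spot fraction approaches a target extracted from the desired pattern.
target, F, k = 0.15, 0.030, 0.062
for it in range(8):
    feat = image_feature(gray_scott(F, k))
    error = target - feat
    F += 0.02 * error              # proportional feedback on the parameter
    print(f"iter {it}: F={F:.4f}  spot fraction={feat:.3f}")
```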
Modules: A New Tool in the Emissions Modeling Framework
DOT National Transportation Integrated Search
2017-08-14
The Emissions Modeling Framework (EMF) is used by various organizations, including the US Environmental Protection Agency, to manage their emissions inventories, projections, and emissions modeling scenarios. Modules are a new tool under develo...
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.
2017-12-01
Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
Modeling spray drift and runoff-related inputs of pesticides to receiving water.
Zhang, Xuyang; Luo, Yuzhou; Goh, Kean S
2018-03-01
Pesticides move to surface water via various pathways including surface runoff, spray drift, and subsurface flow. Little is known about the relative contributions of surface runoff and spray drift in agricultural watersheds. This study develops a modeling framework to address the contribution of spray drift to the total loadings of pesticides in receiving water bodies. The modeling framework consists of a GIS module for identifying drift potential, the AgDRIFT model for simulating spray drift, and the Soil and Water Assessment Tool (SWAT) for simulating various hydrological and landscape processes including surface runoff and transport of pesticides. The modeling framework was applied to the Orestimba Creek Watershed, California. Monitoring data collected from daily samples were used for model evaluation. Pesticide mass deposition on Orestimba Creek ranged from 0.08 to 6.09% of applied mass. Monitoring data suggest that surface runoff was the major pathway for pesticides entering water bodies, accounting for 76% of the annual loading; the remaining 24% came from spray drift. The results from the modeling framework showed 81 and 19%, respectively, for runoff and spray drift. Spray drift contributed over half of the mass loading during summer months. The slightly lower spray drift contribution predicted by the modeling framework was mainly due to SWAT's under-prediction of pesticide mass loading during summer and over-prediction of the loading during winter. Although the model simulations were associated with various sources of uncertainty, the overall performance of the modeling framework was satisfactory as evaluated by multiple statistics: for simulation of daily flow, the Nash-Sutcliffe Efficiency coefficient (NSE) ranged from 0.61 to 0.74 and the percent bias (PBIAS) < 28%; for daily pesticide loading, NSE = 0.18 and PBIAS = -1.6%. This modeling framework will be useful for assessing the relative exposure to pesticides from spray drift and runoff in receiving waters and for the design of management practices to mitigate pesticide exposure within a watershed. Published by Elsevier Ltd.
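The two evaluation statistics quoted above are straightforward to compute from paired observed/simulated series. The sketch below uses hypothetical daily flow values and the common sign convention in which a positive PBIAS indicates underestimation; that convention is an assumption, as sources differ.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect, 0 means no better than the mean of obs."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias, here defined as 100 * sum(obs - sim) / sum(obs):
    positive values indicate underestimation by the model."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# Hypothetical daily flow (m3/s), observed vs. simulated.
obs = [1.2, 0.9, 3.4, 5.1, 2.2, 1.0, 0.8]
sim = [1.0, 1.1, 3.0, 4.6, 2.5, 1.2, 0.9]
print(f"NSE   = {nse(obs, sim):.2f}")
print(f"PBIAS = {pbias(obs, sim):.1f}%")
```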
Clinical time series prediction: towards a hierarchical dynamical system framework
Liu, Zitao; Hauskrecht, Milos
2014-01-01
Objective: Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effect of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods: Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results: We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion: A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671
High Spatial Resolution Multi-Organ Finite Element Modeling of Ventricular-Arterial Coupling
Shavik, Sheikh Mohammad; Jiang, Zhenxiang; Baek, Seungik; Lee, Lik Chuan
2018-01-01
While it has long been recognized that bi-directional interaction between the heart and the vasculature plays a critical role in the proper functioning of the cardiovascular system, a comprehensive study of this interaction has largely been hampered by the lack of a modeling framework capable of simultaneously accommodating high-resolution models of the heart and vasculature. Here, we address this issue and present a computational modeling framework that couples finite element (FE) models of the left ventricle (LV) and aorta to elucidate ventricular-arterial coupling in the systemic circulation. We show in a baseline simulation that the framework predictions of (1) the LV pressure-volume loop, (2) the aorta pressure-diameter relationship, and (3) the pressure waveforms of the aorta, LV, and left atrium (LA) over the cardiac cycle are consistent with physiological measurements found in healthy humans. To develop insights into ventricular-arterial interactions, the framework was then used to simulate how alterations in the geometrical or material parameter(s) of the aorta affect the LV and vice versa. We show that changing the geometry and microstructure of the aorta model in the framework led to changes in the functional behaviors of both the LV and aorta that are consistent with experimental observations. On the other hand, changing the contractility and passive stiffness of the LV model in the framework also produced changes in both the LV and aorta functional behaviors that are consistent with physiological principles. PMID:29551977
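For readers unfamiliar with ventricular-arterial coupling, a far simpler lumped-parameter stand-in than the FE framework described above is the two-element Windkessel, where the aortic pressure responds to a prescribed ventricular outflow. The sketch below integrates it with explicit Euler steps; all parameter values are only roughly physiological and are assumptions for illustration.

```python
import math

# Two-element Windkessel: C dP/dt = Q_in(t) - P/R, a lumped stand-in for the
# aorta receiving flow ejected by the ventricle. Values are illustrative only.
R = 1.0          # mmHg*s/mL, peripheral resistance
C = 1.5          # mL/mmHg, arterial compliance
T = 0.8          # s, cardiac cycle length
dt = 1e-3

def q_in(t):
    """Ventricular outflow: a half-sine ejection during the first 0.3 s of each beat."""
    phase = t % T
    return 400.0 * math.sin(math.pi * phase / 0.3) if phase < 0.3 else 0.0

P = 80.0         # mmHg, initial aortic pressure
p_min, p_max = float("inf"), float("-inf")
t = 0.0
while t < 10 * T:                      # run 10 beats to reach a steady rhythm
    dPdt = (q_in(t) - P / R) / C
    P += dt * dPdt
    t += dt
    if t > 9 * T:                      # record extremes over the last beat only
        p_min, p_max = min(p_min, P), max(p_max, P)
print(f"diastolic ~ {p_min:.0f} mmHg, systolic ~ {p_max:.0f} mmHg")
```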
NASA Astrophysics Data System (ADS)
Onken, Jeffrey
This dissertation introduces a multidisciplinary framework for enabling future research and analysis of alternatives for control centers for real-time operations of safety-critical systems. The multidisciplinary framework integrates functional and computational models that describe the dynamics in fundamental concepts of previously disparate engineering and psychology research disciplines, such as group performance and processes, supervisory control, situation awareness, events and delays, and expertise. The application in this dissertation is the real-time operations within the NASA Mission Control Center in Houston, TX. This dissertation operationalizes the framework into a model and simulation, which simulates the functional and computational models in the framework according to user-configured scenarios for a NASA human-spaceflight mission. The model and simulation generate data according to the effectiveness of the mission-control team in supporting the completion of mission objectives and detecting, isolating, and recovering from anomalies. Accompanying the multidisciplinary framework is a proof of concept, which demonstrates the feasibility of such a framework. The proof of concept demonstrates that variability occurs where expected based on the models. The proof of concept also demonstrates that the data generated from the model and simulation are useful for analyzing and comparing MCC configuration alternatives, because an investigator can give a diverse set of scenarios to the simulation and compare the output in detail to inform decisions about the effect of MCC configurations on mission operations performance.
ERIC Educational Resources Information Center
Wang, Shiyu; Yang, Yan; Culpepper, Steven Andrew; Douglas, Jeffrey A.
2018-01-01
A family of learning models that integrates a cognitive diagnostic model and a higher-order, hidden Markov model in one framework is proposed. This new framework includes covariates to model skill transition in the learning environment. A Bayesian formulation is adopted to estimate parameters from a learning model. The developed methods are…
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2013-12-01
Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
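The control and description functions described above can be illustrated with a BMI-style wrapper around a toy process model. The function names below follow the general BMI pattern (initialize/update/finalize plus variable queries), but the toy linear-reservoir model, its variable name, and the configuration keys are hypothetical; this is a simplified illustration, not the full official BMI specification.

```python
class ToyBmiModel:
    """A toy linear-reservoir runoff model wrapped with a BMI-style interface.

    Control functions (initialize/update/finalize) let a framework drive the
    model; description functions let it discover variables, units, and time.
    This is a simplified illustration of the pattern, not the full BMI spec.
    """

    def initialize(self, config=None):
        cfg = config or {}
        self._k = cfg.get("recession_coefficient", 0.1)    # 1/day
        self._storage = cfg.get("initial_storage", 10.0)   # mm
        self._time = 0.0
        self._dt = 1.0                                      # days

    def update(self):
        outflow = self._k * self._storage
        self._storage -= outflow * self._dt
        self._time += self._dt

    def finalize(self):
        self._storage = None

    # --- description functions -------------------------------------------
    def get_output_var_names(self):
        return ("soil_water__storage_depth",)

    def get_var_units(self, name):
        return {"soil_water__storage_depth": "mm"}[name]

    def get_value(self, name):
        return {"soil_water__storage_depth": self._storage}[name]

    def get_current_time(self):
        return self._time


# A caller (e.g. a coupling framework) only needs the standardized interface.
model = ToyBmiModel()
model.initialize({"initial_storage": 25.0})
while model.get_current_time() < 5.0:
    model.update()
print(model.get_value("soil_water__storage_depth"),
      model.get_var_units("soil_water__storage_depth"))
model.finalize()
```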
Communication: Introducing prescribed biases in out-of-equilibrium Markov models
NASA Astrophysics Data System (ADS)
Dixit, Purushottam D.
2018-03-01
Markov models are often used in modeling complex out-of-equilibrium chemical and biochemical systems. However, many times their predictions do not agree with experiments. We need a systematic framework to update existing Markov models to make them consistent with constraints that are derived from experiments. Here, we present a framework based on the principle of maximum relative path entropy (minimum Kullback-Leibler divergence) to update Markov models using stationary state and dynamical trajectory-based constraints. We illustrate the framework using a biochemical model network of growth factor-based signaling. We also show how to find the closest detailed balanced Markov model to a given Markov model. Further applications and generalizations are discussed.
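The simplest flavor of minimum Kullback-Leibler (maximum relative entropy) updating can be sketched for a stationary distribution with a single experimentally measured average: the updated distribution is an exponential tilting of the prior, with the multiplier chosen to hit the constraint. This is only a stand-in for the stationary-state case; the trajectory-based path-entropy machinery of the paper is not reproduced, and all numbers are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq

# Prior stationary distribution over 4 coarse states of a signalling model
# (hypothetical numbers) and an observable f measured in each state.
p = np.array([0.40, 0.30, 0.20, 0.10])
f = np.array([0.0, 1.0, 2.0, 3.0])
target = 1.5                      # experimentally measured average of f

def tilted(lam):
    """Exponentially tilted distribution q_i proportional to p_i * exp(lam * f_i);
    this form minimizes KL(q || p) among all q matching a prescribed mean of f."""
    w = p * np.exp(lam * f)
    return w / w.sum()

# Choose lam so the constraint <f>_q = target is satisfied.
lam = brentq(lambda l: tilted(l) @ f - target, -50.0, 50.0)
q = tilted(lam)
print("updated distribution:", np.round(q, 3))
print("constraint check:", q @ f)
```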
NASA Astrophysics Data System (ADS)
Peckham, Scott
2016-04-01
Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
Crops in silico: A community wide multi-scale computational modeling framework of plant canopies
NASA Astrophysics Data System (ADS)
Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.
2016-12-01
Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.
National water, food, and trade modeling framework: The case of Egypt.
Abdelkader, A; Elshorbagy, A; Tuninetti, M; Laio, F; Ridolfi, L; Fahmy, H; Hoekstra, A Y
2018-10-15
This paper introduces a modeling framework for the analysis of real and virtual water flows at national scale. The framework has two components: (1) a national water model that simulates agricultural, industrial and municipal water uses, and available water and land resources; and (2) an international virtual water trade model that captures national virtual water exports and imports related to trade in crops and animal products. This National Water, Food & Trade (NWFT) modeling framework is applied to Egypt, a water-poor country and the world's largest importer of wheat. Egypt's food and water gaps and the country's food (virtual water) imports are estimated over a baseline period (1986-2013) and projected up to 2050 based on four scenarios. Egypt's food and water gaps are growing rapidly as a result of steep population growth and limited water resources. The NWFT modeling framework shows the nexus of the population dynamics, water uses for different sectors, and their compounding effects on Egypt's food gap and water self-sufficiency. The sensitivity analysis reveals that for solving Egypt's water and food problem non-water-based solutions like educational, health, and awareness programs aimed at lowering population growth will be an essential addition to the traditional water resources development solution. Both the national and the global models project similar trends of Egypt's food gap. The NWFT modeling framework can be easily adapted to other nations and regions. Copyright © 2018. Published by Elsevier B.V.
A modeling framework for exposing risks in complex systems.
Sharit, J
2000-08-01
This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.
Modeling Philosophies and Applications
All models begin with a framework and a set of assumptions and limitations that go along with that framework. In terms of hydraulic fracturing (fracing) and risk assessment (RA), there are several places where models and parameters must be chosen to complete hazard identification.
NASA Astrophysics Data System (ADS)
Mortensen, Mikael; Langtangen, Hans Petter; Wells, Garth N.
2011-09-01
Finding an appropriate turbulence model for a given flow case usually calls for extensive experimentation with both models and numerical solution methods. This work presents the design and implementation of a flexible, programmable software framework for assisting with numerical experiments in computational turbulence. The framework targets Reynolds-averaged Navier-Stokes models, discretized by finite element methods. The novel implementation makes use of Python and the FEniCS package, the combination of which leads to compact and reusable code, where model- and solver-specific code resemble closely the mathematical formulation of equations and algorithms. The presented ideas and programming techniques are also applicable to other fields that involve systems of nonlinear partial differential equations. We demonstrate the framework in two applications and investigate the impact of various linearizations on the convergence properties of nonlinear solvers for a Reynolds-averaged Navier-Stokes model.
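The FEniCS style referenced above, where the variational form in code reads close to the mathematics and different linearizations can be swapped, can be hinted at with a much smaller sketch: Picard linearization of a simple nonlinear diffusion problem in legacy FEniCS (dolfin). This is an illustrative stand-in under the assumption that legacy dolfin is available; it is not the paper's RANS implementation.

```python
# Nonlinear diffusion -div((1 + u^2) grad u) = f, linearized by Picard iteration.
from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction, TestFunction,
                    Function, Constant, DirichletBC, Expression, dot, grad,
                    dx, solve, errornorm, interpolate)

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "CG", 1)
u, v = TrialFunction(V), TestFunction(V)
bc = DirichletBC(V, Constant(0.0), "on_boundary")
f = Expression("10*exp(-50*(pow(x[0]-0.5,2) + pow(x[1]-0.5,2)))", degree=2)

u_k = interpolate(Constant(0.0), V)     # previous Picard iterate
u_new = Function(V)
for it in range(25):
    a = dot((1.0 + u_k**2) * grad(u), grad(v)) * dx   # coefficient frozen at u_k
    L = f * v * dx
    solve(a == L, u_new, bc)
    diff = errornorm(u_new, u_k, "L2")
    u_k.assign(u_new)
    if diff < 1e-8:                      # convergence of the Picard loop
        break
print("Picard iterations:", it + 1, " final increment:", diff)
```

The same pattern, assembling the bilinear form from the previous iterate, is what allows different linearizations to be tried by editing a single line, which is the kind of experimentation the paper's framework targets.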
Evidence-Based Leadership Development: The 4L Framework
ERIC Educational Resources Information Center
Scott, Shelleyann; Webber, Charles F.
2008-01-01
Purpose: This paper aims to use the results of three research initiatives to present the life-long learning leader 4L framework, a model for leadership development intended for use by designers and providers of leadership development programming. Design/methodology/approach: The 4L model is a conceptual framework that emerged from the analysis of…
ERIC Educational Resources Information Center
Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle
2015-01-01
A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…
ERIC Educational Resources Information Center
Erkut, Sumru; Szalacha, Laura A.; Coll, Cynthia Garcia
2005-01-01
A theoretical framework is proposed for studying minority young men's involvement with their babies that combines the integrative model of minority youth development and a life course developmental perspective with Lamb's revised four-factor model of father involvement. This framework posits a relationship between demographic and family background…
USDA-ARS?s Scientific Manuscript database
Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...
Alternative Frameworks for the Study of Man.
ERIC Educational Resources Information Center
Markova, Ivana
1979-01-01
Two frameworks for the study of man are discussed. The Cartesian model views man as a physical object. A dialectic framework, with the emphasis on the self, grew out of nineteenth century romanticism and reflects the theories of Hegel. Both models have had an effect on social psychology and the study of interpersonal communication. (BH)
ERIC Educational Resources Information Center
Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk
2014-01-01
Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation).…
An introduction to the multisystem model of knowledge integration and translation.
Palmer, Debra; Kramlich, Debra
2011-01-01
Many nurse researchers have designed strategies to assist health care practitioners to move evidence into practice. While many have been identified as "models," most do not have a conceptual framework. They are unidirectional, complex, and difficult for novice research users to understand. These models have focused on empirical knowledge and ignored the importance of practitioners' tacit knowledge. The Communities of Practice conceptual framework allows for the integration of tacit and explicit knowledge into practice. This article describes the development of a new translation model, the Multisystem Model of Knowledge Integration and Translation, supported by the Communities of Practice conceptual framework.
The Inter-Sectoral Impact Model Intercomparison Project (ISI–MIP): Project framework
Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob
2014-01-01
The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316
Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model
ERIC Educational Resources Information Center
Sandaire, Johnny
2009-01-01
A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decisions in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…
Evolution of 3-D geologic framework modeling and its application to groundwater flow studies
Blome, Charles D.; Smith, David V.
2012-01-01
In this Fact Sheet, the authors discuss the evolution of the project's 3-D subsurface framework modeling, research in hydrostratigraphy and airborne geophysics, and the methodologies used to link geologic and groundwater flow models.
THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE
The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...
Jenni, Karen E.; Naftz, David L.; Presser, Theresa S.
2017-10-16
The U.S. Geological Survey, working with the Montana Department of Environmental Quality and the British Columbia Ministry of the Environment and Climate Change Strategy, has developed a conceptual modeling framework that can be used to provide structured and scientifically based input to the Lake Koocanusa Monitoring and Research Working Group as they consider potential site-specific selenium criteria for Lake Koocanusa, a transboundary reservoir located in Montana and British Columbia. This report describes that modeling framework, provides an example of how it can be applied, and outlines possible next steps for implementing the framework.
A Framework for the Optimization of Discrete-Event Simulation Models
NASA Technical Reports Server (NTRS)
Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.
1996-01-01
With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
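The chance-constraint idea combined with replication-based estimation can be sketched on a toy problem: choose the smallest resource level (here, number of servers in a small terminating queue) such that the estimated probability of exceeding a performance threshold stays below a tolerance. The queue model, thresholds, and replication counts are hypothetical; this is not the launch-vehicle model from the paper.

```python
import random
import statistics

def replication(n_servers, n_jobs=200, seed=None):
    """One terminating replication of a toy multi-server queue; returns the
    average waiting time of the jobs in that replication."""
    rng = random.Random(seed)
    free_at = [0.0] * n_servers          # when each server next becomes free
    t, waits = 0.0, []
    for _ in range(n_jobs):
        t += rng.expovariate(1.0)        # inter-arrival times, mean 1.0
        server = min(range(n_servers), key=lambda s: free_at[s])
        start = max(t, free_at[server])
        waits.append(start - t)
        free_at[server] = start + rng.expovariate(1.0 / 0.9)   # service, mean 0.9
    return statistics.mean(waits)

def feasible(n_servers, threshold=0.5, alpha=0.10, reps=200):
    """Chance constraint: P(average wait > threshold) must not exceed alpha,
    estimated from independent replications."""
    exceed = sum(replication(n_servers, seed=r) > threshold for r in range(reps))
    return exceed / reps <= alpha

# Minimize the resource level (number of servers) subject to the chance constraint.
n = 1
while not feasible(n):
    n += 1
print("smallest feasible number of servers:", n)
```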
Conceptual models for cumulative risk assessment.
Linder, Stephen H; Sexton, Ken
2011-12-01
In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.
A mixed model framework for teratology studies.
Braeken, Johan; Tuerlinckx, Francis
2009-10-01
A mixed model framework is presented to model the characteristic multivariate binary anomaly data as provided in some teratology studies. The key features of the model are the incorporation of covariate effects, a flexible random effects distribution by means of a finite mixture, and the application of copula functions to better account for the relation structure of the anomalies. The framework is motivated by data of the Boston Anticonvulsant Teratogenesis study and offers an integrated approach to investigate substantive questions, concerning general and anomaly-specific exposure effects of covariates, interrelations between anomalies, and objective diagnostic measurement.
Moral judgment as information processing: an integrative review.
Guglielmo, Steve
2015-01-01
How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.
Toward a consistent modeling framework to assess multi-sectoral climate impacts.
Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin
2018-02-13
Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets, and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.
Integrated city as a model for a new wave urban tourism
NASA Astrophysics Data System (ADS)
Ariani, V.
2018-03-01
Cities are major players as urban tourism destinations. The massive growth of urban tourism intensifies competition among cities with similar characteristics. A new framework model for new-wave urban tourism is therefore crucial to give tourists richer experiences and to add value for the city itself. The integrated city is proposed as the basis for a new model of an urban tourism destination. The purpose of this preliminary research is to define an integrated-city framework for urban tourism development. It provides a rationale for tourism planners pursuing an innovative approach, competitive advantages, and a general urban tourism destination model. The methodology applied in this research includes a desk survey, a literature review, and a focus group discussion. A conceptual framework is proposed, discussed, and exemplified. The framework adopts a place-based approach to the tourism destination and suggests an integrated city model for urban tourism development. This model is a tool for strategy making when reinventing an integrated city as an urban tourism destination.
Nagy, Balázs; Setyawan, Juliana; Coghill, David; Soroncz-Szabó, Tamás; Kaló, Zoltán; Doshi, Jalpa A
2017-06-01
Models incorporating long-term outcomes (LTOs) are not available to assess the health economic impact of attention-deficit/hyperactivity disorder (ADHD). The aim was to develop a conceptual modelling framework capable of assessing the long-term economic impact of ADHD therapies. The literature was reviewed, and a conceptual structure for the long-term model was outlined with attention to disease characteristics and the potential impact of treatment strategies. The proposed model has four layers: i) a multi-state short-term framework to differentiate between ADHD treatments; ii) multiple states merged into three core health states associated with LTOs; iii) a series of sub-models in which particular LTOs are depicted; iv) outcomes collected to be used either directly for economic analyses or translated into other relevant measures. This conceptual model provides a framework to assess relationships between short- and long-term outcomes of the disease and its treatment, and to estimate the economic impact of ADHD treatments throughout the course of the disease.
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
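For readers unfamiliar with the neuron dynamics referenced above, the following is a minimal sketch of a Chen-Aihara-style chaotic simulated annealing update with decaying self-coupling, applied to a tiny Hopfield-type energy. The 4-neuron weight matrix and all parameter values are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of chaotic simulated annealing (CSA) with decaying self-coupling,
# in the spirit of the Chen-Aihara model described above. The tiny 4-neuron
# Hopfield energy and all parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Symmetric weights and biases of a toy Hopfield energy E(x) = -0.5 x'Wx - I'x
W = np.array([[0., -2., 1., 0.],
              [-2., 0., 0., 1.],
              [1., 0., 0., -2.],
              [0., 1., -2., 0.]])
I = np.array([0.5, 0.5, 0.5, 0.5])

k, alpha, eps = 0.9, 0.015, 0.05   # damping, input scaling, sigmoid steepness
z, beta, I0 = 0.08, 0.01, 0.65     # self-coupling, its decay rate, bias point

y = rng.uniform(-0.1, 0.1, size=4)   # internal neuron states
x = 1.0 / (1.0 + np.exp(-y / eps))   # neuron outputs

for t in range(2000):
    # Damped internal state plus network input, minus a chaotic
    # self-coupling term whose strength decays over time.
    y = k * y + alpha * (W @ x + I) - z * (x - I0)
    x = 1.0 / (1.0 + np.exp(-y / eps))
    z *= (1.0 - beta)                # decaying self-coupling: chaos fades out

energy = -0.5 * x @ W @ x - I @ x
print("final outputs:", np.round(x, 3), "energy:", round(energy, 3))
```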
Moral judgment as information processing: an integrative review
Guglielmo, Steve
2015-01-01
How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022
A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS
This paper discusses a framework for fine-scale CFD modeling that may be developed to complement the present Community Multi-scale Air Quality (CMAQ) modeling system which itself is a computational fluid dynamics model. A goal of this presentation is to stimulate discussions on w...
Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, J. A.
A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).
Graphical Means for Inspecting Qualitative Models of System Behaviour
ERIC Educational Resources Information Center
Bouwer, Anders; Bredeweg, Bert
2010-01-01
This article presents the design and evaluation of a tool for inspecting conceptual models of system behaviour. The basis for this research is the Garp framework for qualitative simulation. This framework includes modelling primitives, such as entities, quantities and causal dependencies, which are combined into model fragments and scenarios.…
ERIC Educational Resources Information Center
Schweizer, Karl
2008-01-01
Structural equation modeling provides the framework for investigating experimental effects on the basis of variances and covariances in repeated measurements. A special type of confirmatory factor analysis as part of this framework enables the appropriate representation of the experimental effect and the separation of experimental and…
The Climate Change Impacts and Risk Analysis (CIRA) project establishes a new multi-model framework to systematically assess the impacts, economic damages, and risks from climate change in the United States. The primary goal of this framework is to estimate how climate change impac...
ERIC Educational Resources Information Center
Mckenna, George Tucker
2017-01-01
The purpose of this study is to determine the levels of concern of Illinois principals regarding the adoption of an evaluation system modeled after Charlotte Danielson's Framework for Teaching. Principal demographics and involvement in the use of and professional development surrounding Charlotte Danielson's Framework for Teaching were studied for…
A Response to the Review of the Community of Inquiry Framework
ERIC Educational Resources Information Center
Akyol, Zehra; Arbaugh, J. Ben; Cleveland-Innes, Marti; Garrison, D. Randy; Ice, Phil; Richardson, Jennifer C.; Swan, Karen
2009-01-01
The Community of Inquiry (CoI) framework has become a prominent model of teaching and learning in online and blended learning environments. Considerable research has been conducted which employs the framework with promising results, resulting in wide use to inform the practice of online and blended teaching and learning. For the CoI model to…
A framework to analyze emissions implications of ...
Future year emissions depend strongly on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal changes, while considering existing regulations and future regulatory uncertainty, and to evaluate the resulting emissions growth patterns. The framework integrates EPA's energy systems model with an economic Input-Output (I/O) Life Cycle Assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies to represent the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL. It is important to characterize exogenous inputs about the economy to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the U.S. economy is constructed to link up with MARKAL. The I/O model enables users to change input requirements (e.g., energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end-users a mechanism for modeling change in the two dimensions of technological progress and consumer preferences that define the future scenarios. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby
In this study, a numerical modeling framework for simulating extreme storm events was established using the Weather Research and Forecasting (WRF) model. Such a framework is necessary for the derivation of engineering parameters, such as probable maximum precipitation, that are the cornerstone of large water management infrastructure design. Here this framework was built based on a heavy storm that occurred in Nashville (USA) in 2010, and verified using two other extreme storms. To achieve the optimal setup, several combinations of model resolutions, initial/boundary conditions (IC/BC), cloud microphysics and cumulus parameterization schemes were evaluated using multiple metrics of precipitation characteristics. The evaluation suggests that WRF is most sensitive to the IC/BC option. Simulations generally benefit from finer resolutions up to 5 km. At the 15 km level, NCEP2 IC/BC produces better results, while NAM IC/BC performs best at the 5 km level. The recommended model configuration from this study is: NAM or NCEP2 IC/BC (depending on data availability), 15 km or 15 km-5 km nested grids, Morrison microphysics and the Kain-Fritsch cumulus scheme. Validation of the optimal framework suggests that these options are good starting choices for modeling extreme events similar to the test cases. This optimal framework is proposed in response to emerging engineering demands for extreme storm event forecasting and analysis for the design, operation and risk assessment of large water infrastructure.
System modeling with the DISC framework: evidence from safety-critical domains.
Reiman, Teemu; Pietikäinen, Elina; Oedewald, Pia; Gotcheva, Nadezhda
2012-01-01
The objective of this paper is to illustrate the development and application of the Design for Integrated Safety Culture (DISC) framework for system modeling by evaluating organizational potential for safety in nuclear and healthcare domains. The DISC framework includes criteria for good safety culture and a description of functions that the organization needs to implement in order to orient the organization toward the criteria. Three case studies will be used to illustrate the utilization of the DISC framework in practice.
On the Conditioning of Machine-Learning-Assisted Turbulence Modeling
NASA Astrophysics Data System (ADS)
Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng
2017-11-01
Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS modeled Reynolds stress by training on available database of high fidelity simulations. However, obtaining improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction performance of mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. With showing the capability of improving the prediction of mean flow field, the proposed stability-oriented machine learning framework bridges the gap between the existing machine-learning-assisted turbulence modeling approaches and the demand of predictive capability of turbulence models in real applications.
A framework for predicting impacts on ecosystem services ...
Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. The framework introduced here represents an ongoing initiative supported by the National Institute of Mathematical and Biological Synthesis (NIMBioS; http://www.nimbi
A bioartificial kidney device with polarized secretion of immune modulators.
Chevtchik, N V; Mihajlovic, M; Fedecostante, M; Bolhuis-Versteeg, L; Sastre Toraño, J; Masereeuw, R; Stamatialis, D
2018-05-15
The accumulation of protein-bound toxins in dialyzed patients is strongly associated with their high morbidity and mortality. The bioartificial kidney device (BAK), containing proximal tubule epithelial cells (PTEC) seeded on functionalized synthetic hollow fiber membranes (HFM), may be a powerful solution for the active removal of those metabolites. In an earlier study, we developed an upscaled BAK containing conditionally immortalized human PTEC (ciPTEC) with functional organic cationic transporter 2 (OCT2). Here, we first extended this development to a BAK device having cells with the organic anionic transporter 1 (OAT1), capable of removing anionic uremic wastes. We confirmed the quality of the ciPTEC monolayer by confocal microscopy and paracellular inulin-FITC leakage, as well as by the active transport of the anionic toxin indoxyl sulfate (IS). Furthermore, we assessed the immune-safety of our system by measuring the production of relevant cytokines by the cells after lipopolysaccharide (LPS) stimulation. Upon LPS treatment, we observed a polarized secretion of pro-inflammatory cytokines by the cells: 10-fold higher in the extraluminal space, corresponding to the urine compartment, as compared to the intraluminal space, corresponding to the blood compartment. To the best of our knowledge, our work is the first to show this favorable cell polarization in a BAK upscaled device. This article is protected by copyright. All rights reserved.
Learning in the model space for cognitive fault diagnosis.
Chen, Huanhuan; Tino, Peter; Rodan, Ali; Yao, Xin
2014-01-01
The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using a series of signal segments selected with a sliding window. By investigating the learning techniques in the fitted model space, faulty models can be discriminated from healthy models using a one-class learning algorithm. The framework enables us to construct a fault library when unknown faults occur, which can be regarded as cognitive fault isolation. This paper also theoretically investigates how to measure the pairwise distance between two models in the model space and incorporates the model distance into the learning algorithm in the model space. The results on three benchmark applications and one simulated model for the Barcelona water distribution network confirm the effectiveness of the proposed framework.
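As a rough illustration of the model-space idea, the sketch below fits a simple AR(2) model to each sliding window of a signal and applies one-class learning to the fitted coefficients. The signals, window sizes and use of scikit-learn's OneClassSVM are illustrative assumptions; the paper's benchmarks and learning algorithm are not reproduced.

```python
# Minimal sketch of "learning in the model space": fit an AR(2) model to each
# sliding window of a signal and run one-class learning on the fitted
# coefficients. Signals, window sizes and the fault are synthetic assumptions.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)

def ar2_coeffs(segment):
    """Least-squares AR(2) coefficients for one signal segment."""
    X = np.column_stack([segment[1:-1], segment[:-2]])
    y = segment[2:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def simulate(a1, a2, n=4000):
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.normal(scale=0.1)
    return x

healthy = simulate(0.6, -0.3)     # nominal dynamics
faulty = simulate(0.95, -0.5)     # drifted dynamics (a "fault")

window, step = 200, 100
def to_model_space(signal):
    return np.array([ar2_coeffs(signal[s:s + window])
                     for s in range(0, len(signal) - window, step)])

train = to_model_space(healthy)
detector = OneClassSVM(nu=0.05, gamma="scale").fit(train)

print("healthy windows flagged:", (detector.predict(to_model_space(healthy)) == -1).mean())
print("faulty  windows flagged:", (detector.predict(to_model_space(faulty)) == -1).mean())
```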
Saunders, Carla; Crossing, Sally; Girgis, Afaf; Butow, Phyllis; Penman, Andrew
2007-01-01
The Consumers' Health Forum of Australia and the National Health and Medical Research Council has recently developed a Model Framework for Consumer and Community Participation in Health and Medical Research in order to better align health and medical research with community need, and improve the impact of research. Model frameworks may have little impact on what goes on in practice unless relevant organisations actively make use of them. Philanthropic and government bodies have reported involving consumers in more meaningful or collaborative ways of late. This paper describes how a large charity organisation, which funds a significant proportion of Australian cancer research, operationalised the model framework using a unique approach demonstrating that it is both possible and reasonable for research to be considerate of public values. PMID:17592651
NASA Astrophysics Data System (ADS)
Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.
2012-04-01
Data assimilation methods have received increased attention as a means of accomplishing uncertainty assessment and enhancing forecasting capability in various areas. Despite their potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, the so-called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike process-based modeling frameworks, this software framework benefits from its object-oriented features to flexibly represent hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters are a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of high-performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting for several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data, such as X-band or C-band radar, is estimated and mitigated in the sequential data assimilation.
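The following is a minimal, non-parallel sketch of the bootstrap particle filter step that underlies such sequential data assimilation, applied to a toy linear-reservoir runoff model. The model, forcing, noise levels and parameter ranges are illustrative assumptions, and this is not the MPI-OHyMoS implementation.

```python
# Minimal bootstrap particle filter on a toy linear-reservoir runoff model.
# The model, forcing, noise levels and parameters are illustrative assumptions;
# this is not the MPI-OHyMoS implementation, only the core assimilation idea.
import numpy as np

rng = np.random.default_rng(2)
n_particles, n_steps = 500, 60

rain = rng.gamma(shape=2.0, scale=1.5, size=n_steps)        # synthetic forcing
k_true = 0.15                                               # true recession rate

def step(storage, k, p):
    """One time step of a linear reservoir: S <- S + P - k*S, Q = k*S."""
    storage = storage + p - k * storage
    return storage, k * storage

# Generate synthetic "observed" discharge from the true model plus noise
S = 5.0
obs = []
for p in rain:
    S, q = step(S, k_true, p)
    obs.append(q + rng.normal(scale=0.1))

# Particles carry both an uncertain state and an uncertain parameter k
states = rng.uniform(0.0, 10.0, n_particles)
ks = rng.uniform(0.05, 0.5, n_particles)

for t, p in enumerate(rain):
    states, q_pred = step(states, ks, p)
    # Weight particles by the likelihood of the observation (Gaussian error)
    w = np.exp(-0.5 * ((obs[t] - q_pred) / 0.1) ** 2) + 1e-12
    w /= w.sum()
    # Resample according to the weights
    idx = rng.choice(n_particles, size=n_particles, p=w)
    states, ks = states[idx], ks[idx]
    # Small jitter on the parameter to avoid sample impoverishment
    ks = np.clip(ks + rng.normal(scale=0.005, size=n_particles), 0.01, 1.0)

print("estimated k: %.3f (true %.2f)" % (ks.mean(), k_true))
```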
Implementing Restricted Maximum Likelihood Estimation in Structural Equation Models
ERIC Educational Resources Information Center
Cheung, Mike W.-L.
2013-01-01
Structural equation modeling (SEM) is now a generic modeling framework for many multivariate techniques applied in the social and behavioral sciences. Many statistical models can be considered either as special cases of SEM or as part of the latent variable modeling framework. One popular extension is the use of SEM to conduct linear mixed-effects…
ERIC Educational Resources Information Center
Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima
2016-01-01
In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this…
The intersection of disability and healthcare disparities: a conceptual framework.
Meade, Michelle A; Mahmoudi, Elham; Lee, Shoou-Yih
2015-01-01
This article provides a conceptual framework for understanding healthcare disparities experienced by individuals with disabilities. While health disparities are the result of factors deeply rooted in culture, lifestyle, socioeconomic status, and accessibility of resources, healthcare disparities are a subset of health disparities that reflect differences in access to and quality of healthcare and can be viewed as the inability of the healthcare system to adequately address the needs of specific population groups. This article uses a narrative method to identify and critique the main conceptual frameworks that have been used in analyzing disparities in healthcare access and quality, and evaluates those frameworks in the context of healthcare for individuals with disabilities. Specific models that are examined include the Aday and Andersen Model, the Grossman Utility Model, the Institute of Medicine (IOM)'s models of Access to Healthcare Services and Healthcare Disparities, and the Cultural Competency model. While existing frameworks advance understandings of disparities in healthcare access and quality, they fall short when applied to individuals with disabilities. Specific deficits include a lack of attention to cultural and contextual factors (Aday and Andersen framework), unrealistic assumptions regarding equal access to resources (Grossman's utility model), lack of recognition or inclusion of concepts of structural accessibility (IOM model of Healthcare Disparities) and exclusive emphasis on the supply side of the healthcare equation to improve healthcare disparities (Cultural Competency model). In response to identified gaps in the literature and shortcomings of current conceptualizations, an integrated model of disability and healthcare disparities is put forth. We analyzed models of access to care and disparities in healthcare to be able to have an integrated and cohesive conceptual framework that could potentially address issues related to access to healthcare among individuals with disabilities. The Model of Healthcare Disparities and Disability (MHDD) provides a framework for conceptualizing how healthcare disparities impact disability and, specifically, how a mismatch between personal and environmental factors may result in reduced healthcare access and quality, which in turn may lead to reduced functioning, activity and participation among individuals with impairments and chronic health conditions. Researchers, health providers, policy makers and community advocate groups who are engaged in devising interventions aimed at reducing healthcare disparities would benefit from the discussions. Implications for Rehabilitation: Evaluates the main models of healthcare disparity and disability to create an integrated framework. Provides a comprehensive conceptual model of healthcare disparity that specifically targets issues related to individuals with disabilities. Conceptualizes how personal and environmental factors interact to produce disparities in access to healthcare and healthcare quality. Recognizes and targets modifiable factors to reduce disparities between and within individuals with disabilities.
Material and morphology parameter sensitivity analysis in particulate composite materials
NASA Astrophysics Data System (ADS)
Zhang, Xiaoyu; Oskay, Caglar
2017-12-01
This manuscript presents a novel parameter sensitivity analysis framework for damage and failure modeling of particulate composite materials subjected to dynamic loading. The proposed framework employs global sensitivity analysis to study the variance in the failure response as a function of model parameters. In view of the computational complexity of performing thousands of detailed microstructural simulations to characterize sensitivities, Gaussian process (GP) surrogate modeling is incorporated into the framework. In order to capture the discontinuity in response surfaces, the GP models are integrated with a support vector machine classification algorithm that identifies the discontinuities within response surfaces. The proposed framework is employed to quantify variability and sensitivities in the failure response of polymer bonded particulate energetic materials under dynamic loads to material properties and morphological parameters that define the material microstructure. Particular emphasis is placed on the identification of sensitivity to interfaces between the polymer binder and the energetic particles. The proposed framework has been demonstrated to identify the most consequential material and morphological parameters under vibrational and impact loads.
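A minimal sketch of the surrogate strategy described above is given below: a support vector classifier locates the discontinuity in a response surface, and a separate Gaussian process surrogate is fitted on each side. The one-dimensional toy response, the regime-labeling rule and the scikit-learn estimators are illustrative assumptions.

```python
# Minimal sketch of the surrogate strategy described above: an SVM classifier
# locates the discontinuity in a response surface, and a separate Gaussian
# process is fitted on each side. The 1-D toy response is an assumption.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def failure_response(x):
    """Toy response with a jump at x = 0.5 (e.g., damage vs. no damage)."""
    return np.where(x < 0.5, 0.2 * x, 1.0 + 0.5 * (x - 0.5))

X = rng.uniform(0, 1, size=(80, 1))
y = failure_response(X.ravel()) + rng.normal(scale=0.02, size=80)

# 1) Classify which regime each sample belongs to (here labeled by a threshold).
labels = (y > 0.6).astype(int)
clf = SVC(kernel="rbf", gamma=10.0).fit(X, labels)

# 2) Fit one GP surrogate per regime.
gps = {}
for lab in (0, 1):
    mask = labels == lab
    gps[lab] = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-3).fit(X[mask], y[mask])

# Predict: route each query point to the GP of its predicted regime.
Xq = np.linspace(0, 1, 11).reshape(-1, 1)
regime = clf.predict(Xq)
pred = np.array([gps[r].predict(Xq[i:i + 1])[0] for i, r in enumerate(regime)])
print(np.round(pred, 3))
```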
NASA Astrophysics Data System (ADS)
Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.
2015-03-01
We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
Price responsiveness of demand for cigarettes: does rationality matter?
Laporte, Audrey
2006-01-01
Meta-analysis is applied to aggregate-level studies that model the demand for cigarettes using static, myopic, or rational addiction frameworks in an attempt to synthesize key findings in the literature and to identify determinants of the variation in reported price elasticity estimates across studies. The results suggest that the rational addiction framework produces statistically similar estimates to the static framework but that studies that use the myopic framework tend to report more elastic price effects. Studies that applied panel data techniques or controlled for cross-border smuggling reported more elastic price elasticity estimates, whereas the use of instrumental variable techniques and time trends or time dummy variables produced less elastic estimates. The finding that myopic models produce different estimates than either of the other two model frameworks underscores that careful attention must be given to time series properties of the data.
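A minimal sketch of the meta-regression idea is shown below: reported price elasticities are regressed on study-design indicators, weighting each study by its inverse variance. All numbers are fabricated for illustration and are not the studies analyzed in the article.

```python
# Minimal meta-regression sketch: regress reported price elasticities on
# study-design indicators, weighted by inverse variance. All numbers below
# are fabricated for illustration and are not the studies in the article.
import numpy as np
import statsmodels.api as sm

# Columns: elasticity estimate, its standard error, myopic-model dummy,
# rational-addiction dummy (static framework is the baseline), panel-data dummy
data = np.array([
    [-0.45, 0.08, 0, 0, 0],
    [-0.62, 0.10, 1, 0, 0],
    [-0.40, 0.07, 0, 1, 0],
    [-0.75, 0.12, 1, 0, 1],
    [-0.50, 0.09, 0, 1, 1],
    [-0.38, 0.06, 0, 0, 1],
    [-0.70, 0.11, 1, 0, 0],
    [-0.42, 0.08, 0, 1, 0],
])
elasticity, se = data[:, 0], data[:, 1]
X = sm.add_constant(data[:, 2:])          # [const, myopic, rational, panel]

model = sm.WLS(elasticity, X, weights=1.0 / se ** 2).fit()
print(model.params)   # baseline elasticity and shifts for each study feature
```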
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagos, Samson; Feng, Zhe; Plant, Robert S.
A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The approach follows non-equilibrium statistical mechanics through a master equation. The aim is to represent the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and mass flux is a non-linear function of convective cell area, mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also reproduces the observed behavior of convective cell populations and CPM-simulated mass flux variability under diurnally varying forcing. Besides its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to be capable of providing alternative, non-equilibrium closure formulations for spectral mass flux parameterizations.
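As a rough illustration of a population-dynamics model of this kind, the sketch below simulates a birth-death population of convective cells in which cells grow, shrink or dissipate stochastically and cloud-base mass flux is a non-linear function of cell area. All rates and the flux exponent are invented for illustration and are not STOMP's calibrated parameters.

```python
# Minimal stochastic sketch of a cloud-population master equation: cells are
# born, grow, shrink and dissipate, and cloud-base mass flux is a non-linear
# function of cell area. All rates and the flux law are illustrative
# assumptions, not the calibrated STOMP parameters.
import numpy as np

rng = np.random.default_rng(4)

birth_rate = 2.0               # new cells per time step under steady forcing
grow_p, decay_p = 0.45, 0.40   # per-cell growth / shrink probabilities

cells = []                     # list of cell areas (arbitrary units)
flux_series = []

for t in range(500):
    # Births: a Poisson number of new small cells
    cells.extend([1.0] * rng.poisson(birth_rate))
    new_cells = []
    for area in cells:
        u = rng.random()
        if u < grow_p:
            area *= 1.3          # growth
        elif u < grow_p + decay_p:
            area *= 0.6          # shrink
        if area >= 0.8:          # cells below a minimum size dissipate
            new_cells.append(area)
    cells = new_cells
    # Mass flux assumed to scale non-linearly with cell area (exponent 1.5)
    flux_series.append(sum(a ** 1.5 for a in cells))

flux = np.array(flux_series[100:])   # discard spin-up
print("mean flux: %.1f, std: %.1f (recharge-discharge variability)" %
      (flux.mean(), flux.std()))
```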
Comparison of methods for the analysis of relatively simple mediation models.
Rijnhart, Judith J M; Twisk, Jos W R; Chinapaw, Mai J M; de Boer, Michiel R; Heymans, Martijn W
2017-09-01
Statistical mediation analysis is an often-used method in trials to unravel the pathways underlying the effect of an intervention on a particular outcome variable. Throughout the years, several methods have been proposed, such as ordinary least squares (OLS) regression, structural equation modeling (SEM), and the potential outcomes framework. Most applied researchers do not know that these methods are mathematically equivalent when applied to mediation models with a continuous mediator and outcome variable. Therefore, the aim of this paper was to demonstrate the similarities between OLS regression, SEM, and the potential outcomes framework in three mediation models: 1) a crude model, 2) a confounder-adjusted model, and 3) a model with an interaction term for exposure-mediator interaction. We performed a secondary data analysis of a randomized controlled trial that included 546 schoolchildren. In our data example, the mediator and outcome variable were both continuous. We compared the estimates of the total, direct and indirect effects, proportion mediated, and 95% confidence intervals (CIs) for the indirect effect across OLS regression, SEM, and the potential outcomes framework. OLS regression, SEM, and the potential outcomes framework yielded the same effect estimates in the crude mediation model, the confounder-adjusted mediation model, and the mediation model with an interaction term for exposure-mediator interaction. Since OLS regression, SEM, and the potential outcomes framework yield the same results in three mediation models with a continuous mediator and outcome variable, researchers can continue using the method that is most convenient to them.
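A minimal sketch of the OLS formulation of the crude mediation model is given below: the indirect effect is estimated as the product of the exposure-to-mediator and mediator-to-outcome coefficients, with a percentile bootstrap confidence interval. The data are simulated and the effect sizes are arbitrary illustrative choices.

```python
# Minimal OLS mediation sketch for the crude model described above:
# a = effect of exposure on mediator, b = effect of mediator on outcome
# given exposure, indirect effect = a*b with a percentile bootstrap CI.
# The data are simulated; effect sizes are arbitrary illustrative choices.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 546
x = rng.binomial(1, 0.5, n)                      # intervention assignment
m = 0.4 * x + rng.normal(size=n)                 # continuous mediator
y = 0.3 * m + 0.2 * x + rng.normal(size=n)       # continuous outcome

def indirect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
    b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[2]
    return a * b

est = indirect(x, m, y)
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    boot.append(indirect(x[idx], m[idx], y[idx]))
ci = np.percentile(boot, [2.5, 97.5])
print("indirect effect %.3f, 95%% CI [%.3f, %.3f]" % (est, ci[0], ci[1]))
```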
Sequence modelling and an extensible data model for genomic database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Peter Wei-Der
1992-01-01
The Human Genome Project (HGP) plans to sequence the human genome by the beginning of the next century. It will generate DNA sequences of more than 10 billion bases and complex marker sequences (maps) of more than 100 million markers. All of this information will be stored in database management systems (DBMSs). However, existing data models do not have the abstraction mechanism for modelling sequences, and existing DBMSs do not have operations for complex sequences. This work addresses the problem of sequence modelling in the context of the HGP and the more general problem of an extensible object data model that can incorporate the sequence model as well as existing and future data constructs and operators. First, we proposed a general sequence model that is application and implementation independent. This model is used to capture the sequence information found in the HGP at the conceptual level. In addition, abstract and biological sequence operators are defined for manipulating the modelled sequences. Second, we combined many features of semantic and object-oriented data models into an extensible framework, which we called the "Extensible Object Model", to address the need for a modelling framework for incorporating the sequence data model with other types of data constructs and operators. This framework is based on the conceptual separation between constructors and constraints. We then used this modelling framework to integrate the constructs for the conceptual sequence model. The Extensible Object Model is also defined with a graphical representation, which is useful as a tool for database designers. Finally, we defined a query language to support this model and implemented a query processor to demonstrate the feasibility of the extensible framework and the usefulness of the conceptual sequence model.
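As a rough illustration of the separation the abstract describes, the sketch below defines a generic sequence construct with abstract operators that a biological sequence type then specializes. The class and method names are invented for illustration and are not the paper's Extensible Object Model API.

```python
# Minimal sketch of an application-independent sequence construct with abstract
# operators, specialized for biological sequences. Class and method names are
# invented for illustration; they are not the paper's Extensible Object Model API.
from dataclasses import dataclass

@dataclass
class Sequence:
    """Generic ordered collection with abstract sequence operators."""
    elements: tuple

    def subsequence(self, start, end):
        return type(self)(self.elements[start:end])

    def concat(self, other):
        return type(self)(self.elements + other.elements)

    def length(self):
        return len(self.elements)

class DNASequence(Sequence):
    """Biological specialization adding domain-specific operators."""
    _COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

    def reverse_complement(self):
        return DNASequence(tuple(self._COMPLEMENT[b] for b in reversed(self.elements)))

s = DNASequence(tuple("ATGCGT"))
print(s.subsequence(0, 3).elements, s.reverse_complement().elements, s.length())
```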
DOT National Transportation Integrated Search
2015-12-01
We develop an econometric framework for incorporating spatial dependence in integrated model systems of latent variables and multidimensional mixed data outcomes. The framework combines Bhat's Generalized Heterogeneous Data Model (GHDM) with a spat...
NASA Astrophysics Data System (ADS)
Peng, B.; Guan, K.; Chen, M.
2016-12-01
Future agricultural production faces a grand challenge of higher temperatures under climate change. There are multiple physiological and metabolic processes through which high temperature affects crop yield. Specifically, we consider the following major processes: (1) direct temperature effects on photosynthesis and respiration; (2) sped-up growth rates and the shortening of the growing season; (3) heat stress during the reproductive stage (flowering and grain-filling); (4) high-temperature-induced increases in atmospheric water demand. In this work, we use a newly developed modeling framework (CLM-APSIM) to simulate corn and soybean growth and explicitly parse the above four processes. By combining the strength of CLM in modeling surface biophysical (e.g., hydrology and energy balance) and biogeochemical (e.g., photosynthesis and carbon-nitrogen interactions) processes with that of APSIM in modeling crop phenology and reproductive stress, the newly developed CLM-APSIM modeling framework enables us to diagnose the impacts of high-temperature stress through different processes at various crop phenology stages. Ground measurements from the advanced SoyFACE facility at the University of Illinois are used here to calibrate, validate, and improve the CLM-APSIM modeling framework at the site level. We finally use the CLM-APSIM modeling framework to project crop yield for the whole US Corn Belt under different climate scenarios.
A framework for scalable parameter estimation of gene circuit models using structural information.
Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin
2013-07-01
Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.
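For orientation, the sketch below fits the rate constants of a small two-gene circuit by integrating its rate equations and minimizing a least-squares objective. The circuit, the synthetic data and the use of a generic SciPy optimizer are illustrative assumptions; the paper's decomposition-based scheme is not reproduced here.

```python
# Minimal sketch of parameter estimation for a two-gene circuit: integrate the
# rate equations and fit rate constants to noisy time-series data by least
# squares. The circuit, data and use of a generic optimizer are illustrative
# assumptions; the decomposition-based scheme of the paper is not reproduced.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def circuit(y, t, k_syn, k_deg, k_rep):
    """Gene A activates its own synthesis; gene B is repressed by A."""
    a, b = y
    da = k_syn * a / (1.0 + a) - k_deg * a
    db = k_syn / (1.0 + k_rep * a) - k_deg * b
    return [da, db]

t = np.linspace(0, 20, 40)
true_params = (1.2, 0.3, 2.0)
rng = np.random.default_rng(6)
data = odeint(circuit, [0.5, 0.1], t, args=true_params)
data += rng.normal(scale=0.02, size=data.shape)          # measurement noise

def residuals(params):
    sim = odeint(circuit, [0.5, 0.1], t, args=tuple(params))
    return (sim - data).ravel()

fit = least_squares(residuals, x0=[0.5, 0.5, 0.5], bounds=(0, 10))
print("estimated parameters:", np.round(fit.x, 2), "true:", true_params)
```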
A flexible framework for process-based hydraulic and water ...
Background: Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. While popular, GI models are generally relatively simplistic. However, GI model predictions are being relied upon by many municipalities and State/Local agencies to make decisions about grey vs. green infrastructure improvement planning. Adding complexity to GI modeling frameworks may preclude their use in simpler urban planning situations. Therefore, the goal here was to develop a sophisticated, yet flexible tool that could be used by design engineers and researchers to capture and explore the effect of design factors and properties of the media used on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool that is capable of simulating GI system components and specific biophysical processes affecting contaminants, such as reactions and particle-associated transport, accurately while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. Framework Features: The process-based model framework developed here can be used to model a diverse range of GI practices such as green roof, retention pond, bioretention, infiltration trench, permeable pavement and
When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems
NASA Astrophysics Data System (ADS)
Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz
2015-03-01
Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole of ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions. Understanding the full impact of uncertainty makes it clear that full comprehension and robust certainty about the systems themselves are not feasible. A key research direction is the development of management systems that are robust to this unavoidable uncertainty.
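A minimal sketch of why uncertainty compounds along such a chain is given below: parameter uncertainty is propagated by Monte Carlo through a toy warming, recruitment and catch sequence, and the relative spread grows at each stage. The component functions and uncertainty ranges are invented for illustration.

```python
# Minimal sketch of compounding uncertainty in a chained modelling framework:
# ocean warming -> recruitment -> catch. Each component carries its own
# parameter uncertainty, and the spread grows along the chain. The component
# functions and uncertainty ranges are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 10000

# Component 1: projected warming (deg C) under scenario uncertainty
warming = rng.normal(loc=2.0, scale=0.5, size=n)

# Component 2: recruitment declines with warming; sensitivity is uncertain
sensitivity = rng.normal(loc=0.15, scale=0.05, size=n)      # fraction per deg C
recruitment = 1.0 - sensitivity * warming                    # relative to today

# Component 3: catch model with its own observation/process uncertainty
catchability = rng.normal(loc=0.8, scale=0.1, size=n)
catch = catchability * np.clip(recruitment, 0.0, None)

for name, x in [("warming", warming), ("recruitment", recruitment), ("catch", catch)]:
    print("%-12s mean %.2f  coeff. of variation %.2f" % (name, x.mean(), x.std() / x.mean()))
```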
Design of a framework for modeling, integration and simulation of physiological models.
Erson, E Zeynep; Cavuşoğlu, M Cenk
2012-09-01
Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework, together with related analytical and computational tools, that will facilitate integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation with new approaches such as information flow and modular representation of physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches for multiscale physiological processes, and thus this paper focuses on the design approaches that would achieve such a goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liang, Likai; Bi, Yushen
Considering the distributed network management system's demands for high distribution, extensibility and reusability, a framework model for a three-tier distributed network management system based on COM/COM+ and DNA is proposed, which adopts software component technology and the N-tier application software framework design idea. We also give the concrete design plan for each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.
Design and Application of an Ontology for Component-Based Modeling of Water Systems
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2012-12-01
Many Earth system modeling frameworks have adopted an approach of componentizing models so that a large model can be assembled by linking a set of smaller model components. These model components can then be more easily reused, extended, and maintained by a large group of model developers and end users. While there has been a notable increase in component-based model frameworks in the Earth sciences in recent years, there has been less work on creating framework-agnostic metadata and ontologies for model components. Well defined model component metadata is needed, however, to facilitate sharing, reuse, and interoperability both within and across Earth system modeling frameworks. To address this need, we have designed an ontology for the water resources community named the Water Resources Component (WRC) ontology in order to advance the application of component-based modeling frameworks across water related disciplines. Here we present the design of the WRC ontology and demonstrate its application for integration of model components used in watershed management. First we show how the watershed modeling system Soil and Water Assessment Tool (SWAT) can be decomposed into a set of hydrological and ecological components that adopt the Open Modeling Interface (OpenMI) standard. Then we show how the components can be used to estimate nitrogen losses from land to surface water for the Baltimore Ecosystem study area. Results of this work are (i) a demonstration of how the WRC ontology advances the conceptual integration between components of water related disciplines by handling the semantic and syntactic heterogeneity present when describing components from different disciplines and (ii) an investigation of a methodology by which large models can be decomposed into a set of model components that can be well described by populating metadata according to the WRC ontology.
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
ERIC Educational Resources Information Center
Psillos, D.; Tselfes, Vassilis; Kariotoglou, Petros
2004-01-01
In the present paper we propose a theoretical framework for an epistemological modelling of teaching-learning (didactical) activities, which draws on recent studies of scientific practice. We present and analyse the framework, which includes three categories: namely, Cosmos-Evidence-Ideas (CEI). We also apply this framework in order to model a…
ERIC Educational Resources Information Center
Zhao, Zhongbao
2013-01-01
This paper aims at developing a procedural framework for the development and validation of diagnostic speaking tests. The researcher reviews the current available models of speaking performance, analyzes the distinctive features and then points out the implications for the development of a procedural framework for diagnostic speaking tests. On…
A scalable delivery framework and a pricing model for streaming media with advertisements
NASA Astrophysics Data System (ADS)
Al-Hadrusi, Musab; Sarhan, Nabil J.
2008-01-01
This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisements' revenues are used to subsidize the price of the media content. The price is determined based on the total advertisement viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and various scheduling policies through extensive simulation in terms of numerous metrics, including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.
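A minimal sketch of the pricing idea is shown below: the price a client pays for the content is the base price minus a subsidy proportional to that client's total advertisement viewing time. The base price, per-second ad revenue and price floor are invented parameters.

```python
# Minimal sketch of ad-subsidized pricing: the price a client pays is the base
# content price minus revenue earned from the ads that client actually viewed.
# Base price, per-second ad revenue and the floor price are invented parameters.
def price(base_price=4.00, ad_revenue_per_sec=0.01, ads_viewing_sec=0, floor=0.0):
    subsidy = ad_revenue_per_sec * ads_viewing_sec
    return max(base_price - subsidy, floor)

for seconds in (0, 60, 180, 600):
    print("ads viewed: %4d s  ->  price: $%.2f" % (seconds, price(ads_viewing_sec=seconds)))
```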
Improved analyses using function datasets and statistical modeling
John S. Hogland; Nathaniel M. Anderson
2014-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...
Mediation Analysis in a Latent Growth Curve Modeling Framework
ERIC Educational Resources Information Center
von Soest, Tilmann; Hagtvet, Knut A.
2011-01-01
This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…
Theories and Frameworks for Online Education: Seeking an Integrated Model
ERIC Educational Resources Information Center
Picciano, Anthony G.
2017-01-01
This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…
Tsao, Liuxing; Ma, Liang
2016-11-01
Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.
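As a rough illustration of driving segmented scan data in forward kinematics, the sketch below chains planar finger segments about estimated joint centres to reach a functional posture. The segment lengths and joint angles are illustrative, and a real pipeline would transform three-dimensional scan meshes rather than 2-D points.

```python
# Minimal 2-D forward-kinematics sketch: finger segments rotate about estimated
# joint centres to reach a functional posture. Segment lengths and joint angles
# are illustrative; a real pipeline would transform 3-D scan meshes instead.
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def finger_joint_positions(segment_lengths, joint_angles):
    """Chain the segments: each joint adds a rotation to the running frame."""
    pos = np.zeros(2)
    frame = np.eye(2)
    points = [pos.copy()]
    for length, angle in zip(segment_lengths, joint_angles):
        frame = frame @ rot(angle)
        pos = pos + frame @ np.array([length, 0.0])
        points.append(pos.copy())
    return np.array(points)

# Proximal, middle and distal phalanx lengths (cm) for one illustrative finger
lengths = [4.5, 2.8, 1.9]
extended = finger_joint_positions(lengths, np.radians([0, 0, 0]))
grasping = finger_joint_positions(lengths, np.radians([30, 45, 30]))
print("fingertip extended:", np.round(extended[-1], 2))
print("fingertip grasping:", np.round(grasping[-1], 2))
```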
Qualitative analysis of a discrete thermostatted kinetic framework modeling complex adaptive systems
NASA Astrophysics Data System (ADS)
Bianca, Carlo; Mogno, Caterina
2018-01-01
This paper deals with the derivation of a new discrete thermostatted kinetic framework for the modeling of complex adaptive systems subjected to external force fields (nonequilibrium system). Specifically, in order to model nonequilibrium stationary states of the system, the external force field is coupled to a dissipative term (thermostat). The well-posedness of the related Cauchy problem is investigated thus allowing the new discrete thermostatted framework to be suitable for the derivation of specific models and the related computational analysis. Applications to crowd dynamics and future research directions are also discussed within the paper.
Using an Integrated, Multi-disciplinary Framework to Support Quantitative Microbial Risk Assessments
The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) provides the infrastructure to link disparate models and databases seamlessly, giving an assessor the ability to construct an appropriate conceptual site model from a host of modeling choices, so a numbe...
Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J
2015-01-01
A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times that are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.
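As a concrete illustration of this kind of joint modeling, the following sketch simulates responses from a 2PL measurement model and response times from a lognormal model linked to the same latent ability through a linear cross-relation. It is a minimal toy example; the parameter values, the 2PL/lognormal pairing, and the cross-relation weight are illustrative assumptions, not the B-GLIRT specification used in the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items = 500, 20

theta = rng.normal(0.0, 1.0, n_persons)      # latent ability
a = rng.uniform(0.8, 1.5, n_items)            # item discriminations
b = rng.normal(0.0, 1.0, n_items)             # item difficulties
beta = rng.normal(4.0, 0.3, n_items)          # item time intensities
rho = 0.5                                     # illustrative cross-relation weight

# Responses: 2PL measurement model
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
responses = rng.binomial(1, p)

# Response times: lognormal model with a linear cross-relation on ability
log_rt = beta - rho * theta[:, None] + rng.normal(0.0, 0.3, (n_persons, n_items))
rts = np.exp(log_rt)

print(responses.shape, rts.shape, responses.mean(), rts.mean())
```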
A penalized framework for distributed lag non-linear models.
Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G
2017-09-01
Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
Multi-level multi-task learning for modeling cross-scale interactions in nested geospatial data
Yuan, Shuai; Zhou, Jiayu; Tan, Pang-Ning; Fergus, Emi; Wagner, Tyler; Sorrano, Patricia
2017-01-01
Predictive modeling of nested geospatial data is a challenging problem as the models must take into account potential interactions among variables defined at different spatial scales. These cross-scale interactions, as they are commonly known, are particularly important to understand relationships among ecological properties at macroscales. In this paper, we present a novel, multi-level multi-task learning framework for modeling nested geospatial data in the lake ecology domain. Specifically, we consider region-specific models to predict lake water quality from multi-scaled factors. Our framework enables distinct models to be developed for each region using both its local and regional information. The framework also allows information to be shared among the region-specific models through their common set of latent factors. Such information sharing helps to create more robust models especially for regions with limited or no training data. In addition, the framework can automatically determine cross-scale interactions between the regional variables and the local variables that are nested within them. Our experimental results show that the proposed framework outperforms all the baseline methods in at least 64% of the regions for 3 out of 4 lake water quality datasets evaluated in this study. Furthermore, the latent factors can be clustered to obtain a new set of regions that is more aligned with the response variables than the original regions that were defined a priori from the ecology domain.
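The shared-latent-factor idea behind this kind of multi-level multi-task learning can be sketched as follows: each region's coefficient vector is a region-specific combination of a small set of factors shared across regions, fit here by a simple alternating scheme on synthetic data. This is a generic illustration under assumed dimensions; it omits the paper's cross-scale interaction terms and its actual optimization algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n_regions, n_features, n_latent, n_per_region = 8, 10, 3, 40

# Synthetic nested data: one design matrix and response vector per region
true_U = rng.normal(size=(n_features, n_latent))
X, y = [], []
for r in range(n_regions):
    v_true = rng.normal(size=n_latent)
    Xr = rng.normal(size=(n_per_region, n_features))
    X.append(Xr)
    y.append(Xr @ (true_U @ v_true) + 0.1 * rng.normal(size=n_per_region))

# Region-specific weights W[:, r] = U @ V[:, r] share the latent factor matrix U
U = rng.normal(size=(n_features, n_latent))
V = rng.normal(size=(n_latent, n_regions))
lam, step = 1e-2, 1e-4
for it in range(200):
    # Ridge update of each region's loadings given the shared factors
    for r in range(n_regions):
        A = X[r] @ U
        V[:, r] = np.linalg.solve(A.T @ A + lam * np.eye(n_latent), A.T @ y[r])
    # Gradient step on the shared factors using all regions' data
    grad = sum(np.outer(X[r].T @ (X[r] @ U @ V[:, r] - y[r]), V[:, r])
               for r in range(n_regions)) + lam * U
    U -= step * grad

W = U @ V  # one predictive model per region, coupled through shared latent factors
print("per-region weight matrix:", W.shape)
```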
A nursing-specific model of EPR documentation: organizational and professional requirements.
von Krogh, Gunn; Nåden, Dagfinn
2008-01-01
To present the Norwegian documentation KPO model (quality assurance, problem solving, and caring). To present the requirements and multiple electronic patient record (EPR) functions the model is designed to address. The model's professional substance, a conceptual framework for nursing practice, is developed by examining, reorganizing, and completing existing frameworks. The model's methodology, an information management system, is developed using an expert group. Both model elements were clinically tested over a period of 1 year. The model is designed for nursing documentation in step with statutory, organizational, and professional requirements. Complete documentation is arranged for by incorporating the Nursing Minimum Data Set. Systematic and comprehensive documentation is arranged for by establishing categories as provided in the model's framework domains. Consistent documentation is arranged for by incorporating NANDA-I Nursing Diagnoses, Nursing Intervention Classification, and Nursing Outcome Classification. The model can be used as a tool in cooperation with vendors to ensure the interests of the nursing profession are met when developing EPR solutions in healthcare. The model can provide clinicians with a framework for documentation in step with legal and organizational requirements and at the same time retain the ability to record all aspects of clinical nursing.
Argumentation in Science Education: A Model-based Framework
NASA Astrophysics Data System (ADS)
Böttcher, Florian; Meisert, Anke
2011-02-01
The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models will be presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin for structurally analysing arguments is contrasted with the approach presented here. It will be demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second, more complex argumentative sequence will also be analysed according to the proposed analytical scheme to give a broader impression of its potential in practical use.
Modeling of ultrasonic processes utilizing a generic software framework
NASA Astrophysics Data System (ADS)
Bruns, P.; Twiefel, J.; Wallaschek, J.
2017-06-01
Modeling of ultrasonic processes is typically characterized by a high degree of complexity. Different domains and size scales must be regarded, so that it is rather difficult to build up a single detailed overall model. Developing partial models is a common approach to overcome this difficulty. In this paper a generic but simple software framework is presented which allows arbitrary partial models to be coupled through slave modules with well-defined interfaces and a master module for coordination. Two examples are given to present the developed framework. The first one is the parameterization of a load model for ultrasonically-induced cavitation. The piezoelectric oscillator, its mounting, and the process load are described individually by partial models. These partial models are then coupled using the framework. The load model is composed of spring-damper elements which are parameterized by experimental results. In the second example, the ideal mounting position for an oscillator utilized in ultrasonic assisted machining of stone is determined. Partial models for the ultrasonic oscillator, its mounting, the simplified contact process, and the workpiece's material characteristics are presented. For both applications, input and output variables are defined to meet the requirements of the framework's interface.
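A minimal sketch of the coupling pattern described above, with a master module that exchanges the outputs of slave modules each time step. The interface names, the time-stepping scheme, and the toy oscillator/load models are assumptions for illustration, not the authors' implementation.

```python
from typing import Dict


class SlaveModule:
    """Partial model with a well-defined step interface."""

    def step(self, t: float, dt: float, inputs: Dict[str, float]) -> Dict[str, float]:
        raise NotImplementedError


class Oscillator(SlaveModule):
    def __init__(self):
        self.x, self.v = 1.0, 0.0

    def step(self, t, dt, inputs):
        # Damped oscillator driven by an external load force
        f_load = inputs.get("load_force", 0.0)
        acc = -self.x - 0.1 * self.v + f_load
        self.v += dt * acc
        self.x += dt * self.v
        return {"displacement": self.x}


class LoadModel(SlaveModule):
    def step(self, t, dt, inputs):
        # Toy process load proportional to the oscillator displacement
        return {"load_force": -0.5 * inputs.get("displacement", 0.0)}


class Master:
    """Coordinates slave modules by exchanging their outputs every step."""

    def __init__(self, modules):
        self.modules = modules
        self.bus: Dict[str, float] = {}

    def run(self, t_end: float, dt: float):
        t = 0.0
        while t < t_end:
            for m in self.modules:
                self.bus.update(m.step(t, dt, self.bus))
            t += dt
        return self.bus


print(Master([Oscillator(), LoadModel()]).run(t_end=5.0, dt=0.01))
```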
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.
Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
Highlights:
• Proposed a physics-informed framework to quantify uncertainty in RANS simulations.
• Framework incorporates physical prior knowledge and observation data.
• Based on a rigorous Bayesian framework yet fully utilizes physical model.
• Applicable for many complex physical systems beyond turbulent flows.
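The iterative ensemble Kalman update at the core of such a framework can be illustrated with a small numpy sketch. Here a random linear map stands in for the forward RANS solver, and the "parameters" stand in for the Reynolds-stress discrepancy coefficients; the dimensions, noise levels, and number of iterations are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_param, n_obs = 50, 5, 3

H = rng.normal(size=(n_obs, n_param))      # toy observation operator (stand-in for the solver)
true_theta = rng.normal(size=n_param)
R = 0.05 * np.eye(n_obs)                   # observation error covariance
y_obs = H @ true_theta + rng.multivariate_normal(np.zeros(n_obs), R)

# Prior ensemble of uncertain parameters
theta = rng.normal(0.0, 1.0, size=(n_ens, n_param))

for it in range(10):                        # iterative ensemble Kalman method
    preds = theta @ H.T                     # forward-model predictions per member
    th_mean, pr_mean = theta.mean(0), preds.mean(0)
    A, B = theta - th_mean, preds - pr_mean
    C_tp = A.T @ B / (n_ens - 1)            # parameter-prediction covariance
    C_pp = B.T @ B / (n_ens - 1)            # prediction covariance
    K = C_tp @ np.linalg.inv(C_pp + R)      # Kalman gain
    perturbed = y_obs + rng.multivariate_normal(np.zeros(n_obs), R, n_ens)
    theta = theta + (perturbed - preds) @ K.T

print("posterior mean:", theta.mean(0), "truth:", true_theta)
```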
Lustig, Audrey; Worner, Susan P; Pitt, Joel P W; Doscher, Crile; Stouffer, Daniel B; Senay, Senait D
2017-10-01
Natural and human-induced events are continuously altering the structure of our landscapes and as a result impacting the spatial relationships between individual landscape elements and the species living in the area. Yet, only recently has the influence of the surrounding landscape on invasive species spread started to be considered. The scientific community increasingly recognizes the need for a broader modeling framework that focuses on cross-study comparisons at different spatiotemporal scales. Using two illustrative examples, we introduce a general modeling framework that allows for a systematic investigation of the effect of habitat change on invasive species establishment and spread. The essential parts of the framework are (i) a mechanistic spatially explicit model (a modular dispersal framework, MDIG) that allows population dynamics and dispersal to be modeled in a geographical information system (GIS), (ii) a landscape generator that allows replicated landscape patterns with partially controllable spatial properties to be generated, and (iii) landscape metrics that depict the essential aspects of landscape with which dispersal and demographic processes interact. The modeling framework provides functionality for a wide variety of applications ranging from predictions of the spatiotemporal spread of real species and comparison of potential management strategies, to theoretical investigation of the effect of habitat change on population dynamics. Such a framework makes it possible to quantify how small-grain landscape characteristics, such as habitat size and habitat connectivity, interact with life-history traits to determine the dynamics of invasive species spread in fragmented landscapes. As such, it will give deeper insights into species traits and landscape features that lead to establishment and spread success and may be key to preventing new incursions and to developing efficient monitoring, surveillance, control or eradication programs.
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
NASA Technical Reports Server (NTRS)
Plitau, Denis; Prasad, Narasimha S.
2012-01-01
The Active Sensing of CO2 Emissions over Nights Days and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57 micron carbon dioxide band as well as the 1.26-1.27 micron oxygen bands for our proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option, the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths, and identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned to be included in the framework. The description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.
Shembel, Adrianna C; Sandage, Mary J; Verdolini Abbott, Katherine
2017-01-01
The purposes of this literature review were (1) to identify and assess frameworks for clinical characterization of episodic laryngeal breathing disorders (ELBD) and their subtypes, (2) to integrate concepts from these frameworks into a novel theoretical paradigm, and (3) to provide a preliminary algorithm to classify clinical features of ELBD for future study of its clinical manifestations and underlying pathophysiological mechanisms. This is a literature review. Peer-reviewed literature from 1983 to 2015 pertaining to models for ELBD was searched using Pubmed, Ovid, Proquest, the Cochrane Database of Systematic Reviews, and Google Scholar. Theoretical models for ELBD were identified, evaluated, and integrated into a novel comprehensive framework. Consensus across three salient models provided a working definition and inclusionary criteria for ELBD within the new framework. Inconsistencies and discrepancies within the models provided an analytic platform for future research. Comparison among the three conceptual models, (1) Irritable larynx syndrome, (2) Dichotomous triggers, and (3) Periodic occurrence of laryngeal obstruction, showed that the models uniformly consider ELBD to involve episodic laryngeal obstruction causing dyspnea. The models differed in their description of the source of dyspnea, in their inclusion of corollary behaviors, in their inclusion of other laryngeal-based behaviors (e.g., cough), and in the types of triggers. The proposed integrated theoretical framework for ELBD provides a preliminary systematic platform for the identification of key clinical feature patterns indicative of ELBD and associated clinical subgroups. This algorithmic paradigm should evolve with better understanding of this spectrum of disorders and its underlying pathophysiological mechanisms. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Streambank stabilization techniques are often implemented to reduce sediment loads from unstable streambanks. Process-based models can predict sediment yields with stabilization scenarios prior to implementation. However, a framework does not exist on how to effectively utilize these models to evalu...
A Conceptual Framework for Curriculum Evaluation in Electrical Engineering Education
ERIC Educational Resources Information Center
Imansari, Nurulita; Sutadji, Eddy
2017-01-01
This article presents a conceptual framework that has been analyzed in the hope that it can support research related to curriculum evaluation. The evaluation model used was the CIPPO model, which consists of "context," "input," "process," "product," and "outcomes." On the dimension of the…
A modeling framework for characterizing near-road air pollutant concentration at community scales
In this study, we combine information from transportation network, traffic emissions, and dispersion model to develop a framework to inform exposure estimates for traffic-related air pollutants (TRAPs) with a high spatial resolution. A Research LINE source dispersion model (R-LIN...
HexSim - A general purpose framework for spatially-explicit, individual-based modeling
HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
NASA Astrophysics Data System (ADS)
Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek
2017-04-01
There is an increasing demand to run environmental models on a big scale: simulations over large areas at high resolution. The heterogeneity of available computing hardware such as multi-core CPUs, GPUs or supercomputers potentially provides significant computing power to fulfil this demand. However, this requires detailed knowledge of the underlying hardware, parallel algorithm design and the implementation thereof in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs to have built-in capabilities to make full usage of the available hardware. Developing such a framework that provides understandable code for domain scientists and is runtime efficient at the same time poses several challenges for its developers. For example, optimisations can be performed on individual operations or the whole model, or tasks need to be generated for a well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichever combination of model building blocks scientists use. We present our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks and 2) parallelisation of about 50 of these building blocks using the new Fern library (https://github.com/geoneric/fern/), an independent generic raster processing library. Fern is a highly generic software library and its algorithms can be configured according to the configuration of a modelling framework. With manageable programming effort (e.g. matching data types between programming and domain language) we created a binding between Fern and PCRaster. The resulting PCRaster Python multicore module can be used to execute existing PCRaster models without having to make any changes to the model code. We show initial results on synthetic and geoscientific models indicating significant runtime improvements provided by parallel local and focal operations. We further outline challenges in improving remaining algorithms such as flow operations over digital elevation maps and further potential improvements like enhancing disk I/O.
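To illustrate the general idea of hiding parallel execution behind a model building block (this is a generic sketch, not the PCRaster or Fern API), the following example parallelises a focal-mean raster operation over row bands with a thread pool; the tiling scheme and worker count are assumptions.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def _mean3x3_tile(padded, r0, r1):
    """Focal 3x3 mean for one row band of an already-padded raster."""
    out = np.zeros((r1 - r0, padded.shape[1] - 2))
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out += padded[r0 + 1 + dr:r1 + 1 + dr, 1 + dc:padded.shape[1] - 1 + dc]
    return out / 9.0

def focal_mean(raster, n_workers=4):
    """Model building block: focal mean, parallelised over row bands."""
    padded = np.pad(raster, 1, mode="edge")
    bounds = np.linspace(0, raster.shape[0], n_workers + 1, dtype=int)
    with ThreadPoolExecutor(n_workers) as pool:
        parts = pool.map(lambda b: _mean3x3_tile(padded, b[0], b[1]),
                         zip(bounds[:-1], bounds[1:]))
    return np.vstack(list(parts))

dem = np.random.default_rng(0).random((1000, 800))
print(focal_mean(dem).shape)
```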
Dynamic motion planning of 3D human locomotion using gradient-based optimization.
Kim, Hyung Joo; Wang, Qian; Rahmatalla, Salam; Swan, Colby C; Arora, Jasbir S; Abdel-Malek, Karim; Assouline, Jose G
2008-06-01
Since humans can walk with an infinite variety of postures and limb movements, there is no unique solution to the modeling problem to predict human gait motions. Accordingly, we test herein the hypothesis that the redundancy of human walking mechanisms makes solving for human joint profiles and force time histories an indeterminate problem best solved by inverse dynamics and optimization methods. A new optimization-based human-modeling framework is thus described for predicting three-dimensional human gait motions on level and inclined planes. The basic unknowns in the framework are the joint motion time histories of a 25-degree-of-freedom human model and its six global degrees of freedom. The joint motion histories are calculated by minimizing an objective function such as deviation of the trunk from upright posture that relates to the human model's performance. A variety of important constraints are imposed on the optimization problem, including (1) satisfaction of dynamic equilibrium equations by requiring the model's zero moment point (ZMP) to lie within the instantaneous geometrical base of support, (2) foot collision avoidance, (3) limits on ground-foot friction, and (4) vanishing yawing moment. Analytical forms of objective and constraint functions are presented and discussed for the proposed human-modeling framework in which the resulting optimization problems are solved using gradient-based mathematical programming techniques. When the framework is applied to the modeling of bipedal locomotion on level and inclined planes, acyclic human walking motions that are smooth and realistic as opposed to less natural robotic motions are obtained. The aspects of the modeling framework requiring further investigation and refinement, as well as potential applications of the framework in biomechanics, are discussed.
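A toy analogue of the gradient-based, constraint-handling formulation described above (not the 25-degree-of-freedom gait model): minimize deviation from a nominal posture of a planar two-link arm subject to an end-point equality constraint and joint-limit inequality constraints, solved with SLSQP. The link lengths, nominal angles, and target position are assumed values.

```python
import numpy as np
from scipy.optimize import minimize

L1, L2 = 0.4, 0.4                      # link lengths (assumed)
q_nominal = np.array([0.3, 0.6])       # preferred joint angles (assumed)
target = np.array([0.5, 0.3])          # desired end-point position (assumed)

def end_point(q):
    # Forward kinematics of a planar two-link arm
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def objective(q):
    # Performance measure: squared deviation from the nominal posture
    return np.sum((q - q_nominal) ** 2)

constraints = [
    {"type": "eq", "fun": lambda q: end_point(q) - target},   # reach the target
    {"type": "ineq", "fun": lambda q: np.pi - np.abs(q)},     # joint limits |q| <= pi
]

res = minimize(objective, x0=np.array([0.1, 0.1]), method="SLSQP",
               constraints=constraints)
print(res.x, end_point(res.x))
```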
Framework for non-coherent interface models at finite displacement jumps and finite strains
NASA Astrophysics Data System (ADS)
Ottosen, Niels Saabye; Ristinmaa, Matti; Mosler, Jörn
2016-05-01
This paper deals with a novel constitutive framework suitable for non-coherent interfaces, such as cracks, undergoing large deformations in a geometrically exact setting. For this type of interface, the displacement field shows a jump across the interface. Within the engineering community, so-called cohesive zone models are frequently applied in order to describe non-coherent interfaces. However, for existing models to comply with the restrictions imposed by (a) thermodynamical consistency (e.g., the second law of thermodynamics), (b) balance equations (in particular, balance of angular momentum) and (c) material frame indifference, these models are essentially fiber models, i.e. models where the traction vector is collinear with the displacement jump. This constrains the ability to model shear and, in addition, anisotropic effects are excluded. A novel, extended constitutive framework which is consistent with the above mentioned fundamental physical principles is elaborated in this paper. In addition to the classical tractions associated with a cohesive zone model, the main idea is to consider additional tractions related to membrane-like forces and out-of-plane shear forces acting within the interface. For zero displacement jump, i.e. coherent interfaces, this framework degenerates to existing formulations presented in the literature. For hyperelasticity, the Helmholtz energy of the proposed novel framework depends on the displacement jump as well as on the tangent vectors of the interface with respect to the current configuration - or equivalently - the Helmholtz energy depends on the displacement jump and the surface deformation gradient. It turns out that by defining the Helmholtz energy in terms of the invariants of these variables, all above-mentioned fundamental physical principles are automatically fulfilled. Extensions of the novel framework necessary for material degradation (damage) and plasticity are also covered.
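Schematically, an invariant-based interface energy of the kind described above might be written as follows; this is an illustrative form under assumed invariants, not the authors' exact expression.

```latex
% Illustrative hyperelastic interface energy in terms of invariants of the
% displacement jump [[u]] and the surface deformation gradient F_s (schematic only):
\psi\bigl([\![\mathbf{u}]\!],\,\mathbf{F}_s\bigr) = \hat{\psi}(i_1, i_2, i_3),
\qquad
i_1 = [\![\mathbf{u}]\!]\cdot[\![\mathbf{u}]\!],\quad
i_2 = [\![\mathbf{u}]\!]\cdot\mathbf{n},\quad
i_3 = \operatorname{tr}\bigl(\mathbf{F}_s^{\mathsf{T}}\mathbf{F}_s\bigr),
\qquad
\mathbf{t} = \frac{\partial\psi}{\partial[\![\mathbf{u}]\!]}
           = 2\,\hat{\psi}_{,i_1}\,[\![\mathbf{u}]\!] + \hat{\psi}_{,i_2}\,\mathbf{n}.
```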
ERIC Educational Resources Information Center
Amershi, Saleema; Conati, Cristina
2009-01-01
In this paper, we present a data-based user modeling framework that uses both unsupervised and supervised classification to build student models for exploratory learning environments. We apply the framework to build student models for two different learning environments and using two different data sources (logged interface and eye-tracking data).…
A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.
Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao
2017-06-16
This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability in addressing the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are the classical models, including Poisson, exponential, and power-law distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model for various applications, including citation analysis, online social networks, and vehicular network design, are also discussed in the paper.
Family Environment and Childhood Obesity: A New Framework with Structural Equation Modeling
Huang, Hui; Wan Mohamed Radzi, Che Wan Jasimah bt; Salarzadeh Jenatabadi, Hashem
2017-01-01
The main purpose of the current article is to introduce a framework of the complexity of childhood obesity based on the family environment. A conceptual model that quantifies the relationships and interactions among parental socioeconomic status, family food security level, child’s food intake and certain aspects of parental feeding behaviour is presented using the structural equation modeling (SEM) concept. Structural models are analysed in terms of the direct and indirect connections among latent and measurement variables that lead to the child weight indicator. To illustrate the accuracy, fit, reliability and validity of the introduced framework, real data collected from 630 families from Urumqi (Xinjiang, China) were considered. The framework includes two categories of data comprising the normal body mass index (BMI) range and obesity data. The comparison analysis between two models provides some evidence that in obesity modeling, obesity data must be extracted from the dataset and analysis must be done separately from the normal BMI range. This study may be helpful for researchers interested in childhood obesity modeling based on family environment. PMID:28208833
NASA Astrophysics Data System (ADS)
Roslindar Yaziz, Siti; Zakaria, Roslinazairimah; Hura Ahmad, Maizah
2017-09-01
The Box-Jenkins - GARCH model has been shown to be a promising tool for forecasting highly volatile time series. In this study, a framework for determining the optimal sample size using the Box-Jenkins model with GARCH is proposed for practical application in analysing and forecasting highly volatile data. The proposed framework is applied to the daily world gold price series from 1971 to 2013. The data are divided into 12 different sample sizes (from 30 to 10200). Each sample is tested using different combinations of the hybrid Box-Jenkins - GARCH model. Our study shows that the optimal sample size for forecasting the gold price using the framework of the hybrid model is 1250 observations, i.e. a 5-year sample. Hence, the empirical results of the model selection criteria and 1-step-ahead forecasting evaluations suggest that the latest 12.25% (5 years) of the 10200 observations is sufficient for the Box-Jenkins - GARCH model, with forecasting performance similar to that obtained using the full 41 years of data.
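A minimal sketch of the hybrid Box-Jenkins - GARCH fitting step on a rolling sample of the suggested size, using the statsmodels and arch Python packages (both assumed to be installed). The synthetic price series, the ARIMA(1,1,1) and GARCH(1,1) orders, and the single one-step forecast shown are illustrative choices, not the paper's exact model combinations or data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

# Illustrative synthetic "gold price" series (stand-in for the real dataset)
rng = np.random.default_rng(0)
price = pd.Series(np.exp(np.cumsum(rng.normal(0.0002, 0.01, 1500))) * 300.0)

sample_size = 1250                       # candidate sample size from the framework
train = price.iloc[-(sample_size + 1):-1]

# Box-Jenkins part: ARIMA(1,1,1) on the price level (order is an assumption)
arima_fit = ARIMA(train, order=(1, 1, 1)).fit()
mean_forecast = arima_fit.forecast(steps=1)

# GARCH(1,1) part: conditional volatility of the ARIMA residuals
resid = arima_fit.resid.dropna()
garch_fit = arch_model(resid, vol="GARCH", p=1, q=1).fit(disp="off")
vol_forecast = np.sqrt(garch_fit.forecast(horizon=1).variance.iloc[-1, 0])

print("1-step price forecast:", float(mean_forecast.iloc[0]))
print("1-step volatility forecast:", float(vol_forecast))
```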
Liu, Zitao; Hauskrecht, Milos
2017-11-01
Building an accurate predictive model of clinical time series for a patient is critical for understanding of the patient condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that, at any point in time, selects the most promising time series model out of the pool of many possible models, and consequently combines advantages of the population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to support personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
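The model-switching idea can be illustrated with a small sketch in which, at each time point, the predictor with the lowest recent one-step error is selected from a pool containing a fixed population model and two simple patient-specific models. The pool members, window length, and synthetic series are assumptions for illustration, not the authors' clinical models.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 200
# Synthetic patient series: population trend plus a patient-specific drift
series = 5.0 + 0.02 * np.arange(T) + np.cumsum(rng.normal(0, 0.05, T))

predictors = {
    "population": lambda hist: 5.0 + 0.02 * len(hist),                   # fixed population model
    "last_value": lambda hist: hist[-1],                                 # naive patient-specific model
    "local_trend": lambda hist: hist[-1] + (hist[-1] - hist[-5]) / 4.0,  # short-term trend model
}

window, errors = 10, {name: [] for name in predictors}
preds = []
for t in range(5, T):
    hist = series[:t]
    # Pick the model with the smallest mean absolute error over the recent window
    recent = {n: np.mean(e[-window:]) if e else np.inf for n, e in errors.items()}
    best = min(recent, key=recent.get)
    best = best if np.isfinite(recent[best]) else "last_value"  # fallback until history exists
    preds.append(predictors[best](hist))
    for name, f in predictors.items():
        errors[name].append(abs(f(hist) - series[t]))

print("switching MAE:", np.mean(np.abs(np.array(preds) - series[5:])))
```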
Framework of distributed coupled atmosphere-ocean-wave modeling system
NASA Astrophysics Data System (ADS)
Wen, Yuanqiao; Huang, Liwen; Deng, Jian; Zhang, Jinfeng; Wang, Sisi; Wang, Lijun
2006-05-01
In order to research the interactions between the atmosphere and ocean as well as their important role in the intensive weather systems of coastal areas, and to improve the forecasting ability of the hazardous weather processes of coastal areas, a coupled atmosphere-ocean-wave modeling system has been developed. The agent-based environment framework for linking models allows flexible and dynamic information exchange between models. For the purpose of flexibility, portability and scalability, the framework of the whole system takes a multi-layer architecture that includes a user interface layer, computational layer and service-enabling layer. The numerical experiment presented in this paper demonstrates the performance of the distributed coupled modeling system.
Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul; ...
2017-12-20
We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.
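A minimal pyomo.dae sketch, assuming Pyomo and the IPOPT solver are installed: a single first-order ODE is declared in its natural differential form and automatically transcribed to a finite-dimensional algebraic problem with a backward finite-difference scheme. The equation and discretization settings are illustrative, not taken from the paper.

```python
from pyomo.environ import (ConcreteModel, Var, Constraint, Objective,
                           TransformationFactory, SolverFactory, value)
from pyomo.dae import ContinuousSet, DerivativeVar

m = ConcreteModel()
m.t = ContinuousSet(bounds=(0.0, 2.0))
m.x = Var(m.t)
m.dxdt = DerivativeVar(m.x, wrt=m.t)

# Differential equation dx/dt = -x, stated in its natural (unrestricted) form
m.ode = Constraint(m.t, rule=lambda m, t: m.dxdt[t] == -m.x[t])
m.x[0.0].fix(1.0)
# Dummy objective: the trajectory is fully determined by the ODE and initial condition
m.obj = Objective(expr=m.x[2.0])

# Automatic transformation to a finite-dimensional algebraic problem
TransformationFactory('dae.finite_difference').apply_to(m, nfe=40, scheme='BACKWARD')
SolverFactory('ipopt').solve(m)

print('x(2) ~=', value(m.x[2.0]))
```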
NASA Astrophysics Data System (ADS)
Liu, Yaoze; Engel, Bernard A.; Flanagan, Dennis C.; Gitau, Margaret W.; McMillan, Sara K.; Chaubey, Indrajeet; Singh, Shweta
2018-05-01
Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water quality models, a high level and forward-looking modeling framework was developed. The components in the framework consist of establishment period efficiency, starting efficiency, efficiency for each storm event, efficiency between maintenance, and efficiency over the life cycle. Combined, they represent long-term efficiency for a specific type of practice and specific environmental concern (runoff/pollutant). An approach for possible implementation of the framework was discussed. The long-term impacts of grass buffer strips (agricultural BMP) and bioretention systems (urban BMP) in reducing total phosphorus were simulated to demonstrate the framework. Data gaps were captured in estimating the long-term performance of the BMPs. A Bayesian method was used to match the simulated distribution of long-term BMP efficiencies with the observed distribution with the assumption that the observed data represented long-term BMP efficiencies. The simulated distribution matched the observed distribution well with only small total predictive uncertainties. With additional data, the same method can be used to further improve the simulation results. The modeling framework and results of this study, which can be adopted in hydrologic/water quality models to better represent long-term BMP effectiveness, can help improve decision support systems for creating long-term stormwater management strategies for watershed management projects.
Merritt, Brett; Urban-Lurain, Mark; Parker, Joyce
2010-01-01
Recent science education reform has been marked by a shift away from a focus on facts toward deep, rich, conceptual understanding. This requires assessment that also focuses on conceptual understanding rather than recall of facts. This study outlines our development of a new assessment framework and tool, a taxonomy, which, unlike existing frameworks and tools, is grounded firmly in a framework that considers the critical role that models play in science. It also provides instructors a resource for assessing students' ability to reason about models that are central to the organization of key scientific concepts. We describe preliminary data arising from the application of our tool to exam questions used by instructors of a large-enrollment cell and molecular biology course over a 5-yr period during which time our framework and the assessment tool were increasingly used. Students were increasingly able to describe and manipulate models of the processes and systems being studied in this course as measured by assessment items. However, their ability to apply these models in new contexts did not improve. Finally, we discuss the implications of our results and the future directions for our research. PMID:21123691
Forbes, Valery E; Salice, Chris J; Birnir, Bjorn; Bruins, Randy J F; Calow, Peter; Ducrot, Virginie; Galic, Nika; Garber, Kristina; Harvey, Bret C; Jager, Henriette; Kanarek, Andrew; Pastorok, Robert; Railsback, Steve F; Rebarber, Richard; Thorbek, Pernille
2017-04-01
Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. Environ Toxicol Chem 2017;36:845-859. © 2017 SETAC.
Classification framework for partially observed dynamical systems
NASA Astrophysics Data System (ADS)
Shen, Yuan; Tino, Peter; Tsaneva-Atanasova, Krasimira
2017-04-01
We present a general framework for classifying partially observed dynamical systems based on the idea of learning in the model space. In contrast to the existing approaches using point estimates of model parameters to represent individual data items, we employ posterior distributions over model parameters, thus taking into account in a principled manner the uncertainty due to both the generative (observational and/or dynamic noise) and observation (sampling in time) processes. We evaluate the framework on two test beds: a biological pathway model and a stochastic double-well system. Crucially, we show that the classification performance is not impaired when the model structure used for inferring posterior distributions is much more simple than the observation-generating model structure, provided the reduced-complexity inferential model structure captures the essential characteristics needed for the given classification task.
Improved Hypoxia Modeling for Nutrient Control Decisions in the Gulf of Mexico
NASA Technical Reports Server (NTRS)
Habib, Shahid; Pickering, Ken; Tzortziou, Maria; Maninio, Antonio; Policelli, Fritz; Stehr, Jeff
2011-01-01
The Gulf of Mexico Modeling Framework is a suite of coupled models linking the deposition and transport of sediment and nutrients to subsequent biogeochemical processes and the resulting effect on concentrations of dissolved oxygen in the coastal waters of Louisiana and Texas. Here, we examine the potential benefits of using multiple NASA remote sensing data products within this Modeling Framework for increasing the accuracy of the models and their utility for nutrient control decisions in the Gulf of Mexico. Our approach is divided into three components: evaluation and improvement of (a) the precipitation input data, (b) atmospheric constituent concentrations in EPA's air quality/deposition model, and (c) the calculation of algal biomass, organic carbon and suspended solids within the water quality/eutrophication models of the framework.
A hybrid model of cell cycle in mammals.
Behaegel, Jonathan; Comet, Jean-Paul; Bernot, Gilles; Cornillon, Emilien; Delaunay, Franck
2016-02-01
Time plays an essential role in many biological systems, especially in the cell cycle. Many models of biological systems rely on differential equations, but parameter identification is an obstacle to using differential frameworks. In this paper, we present a new hybrid modeling framework that extends René Thomas' discrete modeling. The core idea is to associate with each qualitative state "celerities" that allow us to compute the time spent in each state. This hybrid framework is illustrated by building a 5-variable model of the mammalian cell cycle. Its parameters are determined by applying formal methods to the underlying discrete model and by constraining parameters using timing observations on the cell cycle. This first hybrid model reproduces the most important known behaviors of the cell cycle, including the quiescent phase and endoreplication.
Nahum-Shani, Inbal; Hekler, Eric B.; Spruijt-Metz, Donna
2016-01-01
Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs)--suites of interventions that adapt over time to an individual’s changing status and circumstances with the goal to address the individual’s need for support, whenever this need arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs. PMID:26651462
Predictive Models and Computational Embryology
EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...
Multi-Scale Multi-Domain Model | Transportation Research | NREL
NREL's Multi-Scale Multi-Domain (MSMD) model quantifies the impacts of electrical/thermal pathway design. Macroscopic design factors and highly dynamic environmental conditions significantly influence the design of affordable, long-lasting, high-performing, and safe large battery systems.
USDA-ARS?s Scientific Manuscript database
An improved modeling framework for capturing the effects of dynamic resistance to overland flow is developed for intensively managed landscapes. The framework builds on the WEPP model but it removes the limitations of the “equivalent” plane and static roughness assumption. The enhanced model therefo...
An Exploration of the Factors Influencing the Adoption of an IS Governance Framework
ERIC Educational Resources Information Center
Parker, Sharon L.
2013-01-01
This research explored IT governance framework adoption, leveraging established IS theories. It applied both the technology acceptance model (TAM) and the technology, organization, environment (TOE) models. The study consisted of developing a model utilizing TOE and TAM, deriving relevant hypotheses. Interviews with a group of practitioners…
Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karali, Nihan; Xu, Tengfang; Sathaye, Jayant
2012-12-12
The goal of this study is to develop a new bottom-up industry sector energy-modeling framework aimed at addressing least-cost regional and global carbon reduction strategies and at improving on the capabilities and limitations of existing models by allowing trading across regions and countries as an alternative.
Order-Constrained Bayes Inference for Dichotomous Models of Unidimensional Nonparametric IRT
ERIC Educational Resources Information Center
Karabatsos, George; Sheu, Ching-Fan
2004-01-01
This study introduces an order-constrained Bayes inference framework useful for analyzing data containing dichotomous scored item responses, under the assumptions of either the monotone homogeneity model or the double monotonicity model of nonparametric item response theory (NIRT). The framework involves the implementation of Gibbs sampling to…
A geostatistical extreme-value framework for fast simulation of natural hazard events
Stephenson, David B.
2016-01-01
We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
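A toy illustration of the two ingredients named above, generalized Pareto margins combined with a Student's t dependence structure, using scipy. The station layout, correlation model, degrees of freedom, and GPD parameters are all assumed values for illustration, not the fitted windstorm model.

```python
import numpy as np
from scipy import stats
from scipy.spatial.distance import cdist

rng = np.random.default_rng(3)

# Toy station locations and an exponential correlation model (assumed)
sites = rng.uniform(0, 100, size=(30, 2))
corr = np.exp(-cdist(sites, sites) / 40.0)

nu, n_events = 5, 1000
# Student's t-process draws: Gaussian field scaled by an inverse-chi draw
z = rng.multivariate_normal(np.zeros(len(sites)), corr, size=n_events)
t_field = z / np.sqrt(rng.chisquare(nu, size=(n_events, 1)) / nu)

# Transform to uniform margins, then to generalized Pareto gust values
u = np.clip(stats.t.cdf(t_field, df=nu), 1e-12, 1 - 1e-12)
gusts = stats.genpareto.ppf(u, c=0.1, loc=25.0, scale=8.0)   # GPD margins (assumed params)

print(gusts.shape, gusts.max(), np.quantile(gusts, 0.99))
```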
Robust Decision-making Applied to Model Selection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemez, Francois M.
2012-08-06
The scientific and engineering communities are relying more and more on numerical models to simulate increasingly complex phenomena. Selecting a model, from among a family of models that meets the simulation requirements, presents a challenge to modern-day analysts. To address this concern, a framework anchored in info-gap decision theory is adopted. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: Which model, of several numerical models, approximates the behavior of a structure when parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.
Linking service quality, customer satisfaction, and behavioral intention.
Woodside, A G; Frey, L L; Daly, R T
1989-12-01
Based on the service quality and script theory literature, a framework of relationships among service quality, customer satisfaction, and behavioral intention for service purchases is proposed. Specific models are developed from the general framework and the models are applied and tested for the highly complex and divergent consumer service of overnight hospital care. Service quality, customer satisfaction, and behavioral intention data were collected from recent patients of two hospitals. The findings support the specific models and general framework. Implications for theory, service marketing, and future research are discussed.
Decision support models for solid waste management: Review and game-theoretic approaches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karmperis, Athanasios C., E-mail: athkarmp@mail.ntua.gr; Army Corps of Engineers, Hellenic Army General Staff, Ministry of Defence; Aravossis, Konstantinos
Highlights:
► The most commonly used decision support frameworks for solid waste management are reviewed.
► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed.
► The game-theoretic approach in a solid waste management context is presented.
► The waste management bargaining game is introduced as a specific decision support framework.
► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed.
Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are developed mainly within three decision support frameworks, which are the life-cycle assessment, the cost–benefit analysis and the multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.
Structured statistical models of inductive reasoning.
Kemp, Charles; Tenenbaum, Joshua B
2009-01-01
Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet both goals and describes [corrected] 4 applications of the framework: a taxonomic model, a spatial model, a threshold model, and a causal model. Each model makes probabilistic inferences about the extensions of novel properties, but the priors for the 4 models are defined over different kinds of structures that capture different relationships between the categories in a domain. The framework therefore shows how statistical inference can operate over structured background knowledge, and the authors argue that this interaction between structure and statistics is critical for explaining the power and flexibility of human reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) Program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast
Pang, Wei; Coghill, George M.
2015-01-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377
Belcher, Wayne R.; Faunt, Claudia C.; D'Agnese, Frank A.
2002-01-01
The U.S. Geological Survey, in cooperation with the Department of Energy and other Federal, State, and local agencies, is evaluating the hydrogeologic characteristics of the Death Valley regional ground-water flow system. The ground-water flow system covers an area of about 100,000 square kilometers from latitude 35° to 38°15' North and longitude 115° to 118° West, with the flow system proper comprising about 45,000 square kilometers. The Death Valley regional ground-water flow system is one of the larger flow systems within the Southwestern United States and includes in its boundaries the Nevada Test Site, Yucca Mountain, and much of Death Valley. Part of this study includes the construction of a three-dimensional hydrogeologic framework model to serve as the foundation for the development of a steady-state regional ground-water flow model. The digital framework model provides a computer-based description of the geometry and composition of the hydrogeologic units that control regional flow. The framework model of the region was constructed by merging two previous framework models constructed for the Yucca Mountain Project and the Environmental Restoration Program Underground Test Area studies at the Nevada Test Site. The hydrologic characteristics of the region result from a currently arid climate and complex geology. Interbasinal regional ground-water flow occurs through a thick carbonate-rock sequence of Paleozoic age, a locally thick volcanic-rock sequence of Tertiary age, and basin-fill alluvium of Tertiary and Quaternary age. Throughout the system, deep and shallow ground-water flow may be controlled by extensive and pervasive regional and local faults and fractures. The framework model was constructed using data from several sources to define the geometry of the regional hydrogeologic units. These data sources include (1) a 1:250,000-scale hydrogeologic-map compilation of the region; (2) regional-scale geologic cross sections; (3) borehole information; and (4) gridded surfaces from a previous three-dimensional geologic model. In addition, digital elevation model data were used in conjunction with these data to define ground-surface altitudes. These data, properly oriented in three dimensions by using geographic information systems, were combined and gridded to produce the upper surfaces of the hydrogeologic units used in the flow model. The final geometry of the framework model is constructed as a volumetric model by incorporating the intersections of these gridded surfaces and by applying fault truncation rules to structural features from the geologic map and cross sections. The cells defining the geometry of the hydrogeologic framework model can be assigned several attributes such as lithology, hydrogeologic unit, thickness, and top and bottom altitudes.
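The volumetric construction step, assigning a hydrogeologic unit to each model cell by intersecting gridded unit-top surfaces, can be sketched as follows; the unit names, surfaces and grid spacing are invented placeholders, and the fault-truncation rules are omitted.

```python
import numpy as np

# Hypothetical gridded top surfaces (metres above datum) for three stacked
# hydrogeologic units, ordered from youngest (highest) to oldest (lowest).
nx, ny, nz = 4, 3, 10
unit_tops = {
    "basin_fill":     np.full((ny, nx), 300.0),
    "volcanic_rock":  np.full((ny, nx), 150.0),
    "carbonate_rock": np.full((ny, nx), -200.0),
}
z_cells = np.linspace(350.0, -400.0, nz)          # cell-centre altitudes

# Each cell gets the deepest unit whose top surface lies at or above it;
# cells above every top surface stay -1 (above land surface).
unit_index = np.full((nz, ny, nx), -1, dtype=int)
for k, z in enumerate(z_cells):
    for idx, top in enumerate(unit_tops.values()):
        unit_index[k][z <= top] = idx

print(unit_index[:, 0, 0])   # vertical column of unit indices at one location
```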
NASA Astrophysics Data System (ADS)
Chalabi, Zaid; Milojevic, Ai; Doherty, Ruth M.; Stevenson, David S.; MacKenzie, Ian A.; Milner, James; Vieno, Massimo; Williams, Martin; Wilkinson, Paul
2017-10-01
A decision support system for evaluating UK air quality policies is presented. It combines the output from a chemistry transport model, a health impact model and other impact models within a multi-criteria decision analysis (MCDA) framework. As a proof-of-concept, the MCDA framework is used to evaluate and compare idealized emission reduction policies in four sectors (combustion in energy and transformation industries, non-industrial combustion plants, road transport and agriculture) and across six outcomes or criteria (mortality, health inequality, greenhouse gas emissions, biodiversity, crop yield and air quality legal compliance). To illustrate a realistic use of the MCDA framework, the relative importance of the criteria was elicited from a number of stakeholders acting as proxy policy makers. In the prototype decision problem, we show that reducing emissions from industrial combustion (followed very closely by road transport and agriculture) is more advantageous than equivalent reductions from the other sectors when all the criteria are taken into account. Extensions of the MCDA framework to support policy makers in practice are discussed.
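A weighted-sum aggregation is one of the simplest MCDA schemes and shows how elicited criterion weights and per-sector outcome scores could be combined into a ranking; the weights and scores below are invented, not the study's elicited values.

```python
import numpy as np

criteria = ["mortality", "inequality", "GHG", "biodiversity", "crops", "compliance"]
sectors  = ["industrial combustion", "non-industrial combustion",
            "road transport", "agriculture"]

# Hypothetical elicited weights (sum to 1) and normalized scores in [0, 1],
# where higher means a more favourable outcome of the emission reduction.
weights = np.array([0.30, 0.15, 0.20, 0.10, 0.10, 0.15])
scores = np.array([
    [0.8, 0.6, 0.7, 0.5, 0.6, 0.9],   # industrial combustion
    [0.5, 0.4, 0.5, 0.4, 0.5, 0.6],   # non-industrial combustion
    [0.7, 0.7, 0.6, 0.4, 0.5, 0.8],   # road transport
    [0.6, 0.5, 0.6, 0.7, 0.4, 0.5],   # agriculture
])

totals = scores @ weights                 # weighted-sum value of each policy
for i in np.argsort(totals)[::-1]:
    print(f"{sectors[i]:28s} {totals[i]:.3f}")
```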
Singh, Karandeep; Ahn, Chang-Won; Paik, Euihyun; Bae, Jang Won; Lee, Chun-Hee
2018-01-01
Artificial life (ALife) examines systems related to natural life, its processes, and its evolution, using simulations with computer models, robotics, and biochemistry. In this article, we focus on the computer modeling, or "soft," aspects of ALife and prepare a framework for scientists and modelers to be able to support such experiments. The framework is designed and built to be a parallel as well as distributed agent-based modeling environment, and does not require end users to have expertise in parallel or distributed computing. Furthermore, we use this framework to implement a hybrid model using microsimulation and agent-based modeling techniques to generate an artificial society. We leverage this artificial society to simulate and analyze population dynamics using Korean population census data. The agents in this model derive their decisional behaviors from real data (microsimulation feature) and interact among themselves (agent-based modeling feature) to proceed in the simulation. The behaviors, interactions, and social scenarios of the agents are varied to perform an analysis of population dynamics. We also estimate the future cost of pension policies based on the future population structure of the artificial society. The proposed framework and model demonstrates how ALife techniques can be used by researchers in relation to social issues and policies.
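A minimal agent-based population sketch in the spirit of the hybrid model described above, using invented age-dependent birth and death probabilities rather than the Korean census data:

```python
import random

random.seed(0)

class Person:
    def __init__(self, age):
        self.age = age

def step(population, birth_p=0.02, death_p_per_year_of_age=0.0012):
    """Advance the artificial society by one year (all rates invented)."""
    survivors, newborns = [], []
    for p in population:
        if random.random() < p.age * death_p_per_year_of_age:
            continue                     # agent dies this year
        p.age += 1
        survivors.append(p)
        if 20 <= p.age <= 40 and random.random() < birth_p:
            newborns.append(Person(age=0))
    return survivors + newborns

population = [Person(age=random.randint(0, 80)) for _ in range(10_000)]
for year in range(2020, 2031):
    population = step(population)
    print(year, len(population))
```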
Development of an "Alert Framework" Based on the Practices in the Medical Front.
Sakata, Takuya; Araki, Kenji; Yamazaki, Tomoyoshi; Kawano, Koichi; Maeda, Minoru; Kushima, Muneo; Araki, Sanae
2018-05-09
At the University of Miyazaki Hospital (UMH), we have accumulated and semantically structured a vast amount of medical information since the activation of the electronic health record system approximately 10 years ago. With this medical information, we have decided to develop an alert system for aiding in medical treatment. The purpose of this investigation is not only to integrate an alert framework into the electronic health record system, but also to formulate a modeling method for this knowledge. A trial alert framework was developed for the staff in various occupational categories at the UMH. Based on the findings of subsequent interviews, a more detailed and upgraded alert framework was constructed, resulting in the final model. Based on our current findings, an alert framework was developed with four major items. Based on the analysis of the medical practices from the trial model, it has been concluded that there are four major risk patterns that trigger the alert. Furthermore, the current alert framework contains detailed definitions which are easily substituted into the database, leading to easy implementation of the electronic health records.
NASA Astrophysics Data System (ADS)
Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo
2013-02-01
With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following the explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.
Modular modelling with Physiome standards
Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.
2016-01-01
Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks. But relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole-cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.
2016-12-01
Advances in computational resources and modeling techniques are opening the path to effectively integrate existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area including the New Jersey Transit's main storage and maintenance facility. The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.
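One benefit of an ensemble over a deterministic run is that member forecasts can be turned into exceedance probabilities for a flood threshold; the member peak stages and threshold below are invented stand-ins for the 125-member output.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical peak water levels (m) predicted by the ensemble members for one
# river gauge within the 96-hour forecast horizon.
peak_stage = rng.normal(loc=3.2, scale=0.6, size=125)

flood_threshold = 3.5   # illustrative flood stage (m)
prob_exceed = np.mean(peak_stage >= flood_threshold)
print(f"Ensemble mean peak stage: {peak_stage.mean():.2f} m")
print(f"P(peak >= {flood_threshold} m): {prob_exceed:.2f}")
```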
A Framework for the Study of Emotions in Organizational Contexts.
ERIC Educational Resources Information Center
Fiebig, Greg V.; Kramer, Michael W.
1998-01-01
Approaches the study of emotions in organizations holistically, based on a proposed framework. Provides descriptive data that suggests the presence of the framework's major elements. States that future examination of emotions based on this framework should assist in understanding emotions, which are frequently ignored in a rational model. (PA)
Testing a Conceptual Change Model Framework for Visual Data
ERIC Educational Resources Information Center
Finson, Kevin D.; Pedersen, Jon E.
2015-01-01
An emergent data analysis technique was employed to test the veracity of a conceptual framework constructed around visual data use and instruction in science classrooms. The framework incorporated all five key components Vosniadou (2007a, 2007b) described as existing in a learner's schema: framework theory, presuppositions, conceptual domains,…
Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin
2016-01-01
Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models. PMID:27141516
Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin
2016-01-01
The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.
The Effect of Framework Design on Stress Distribution in Implant-Supported FPDs: A 3-D FEM Study
Eraslan, Oguz; Inan, Ozgur; Secilmis, Asli
2010-01-01
Objectives: The biomechanical behavior of the superstructure plays an important role in the functional longevity of dental implants. However, information about the influence of framework design on stresses transmitted to the implants and supporting tissues is limited. The purpose of this study was to evaluate the effects of framework designs on stress distribution at the supporting bone and supporting implants. Methods: In this study, the three-dimensional (3D) finite element stress analysis method was used. Three types of 3D mathematical models simulating three different framework designs for implant-supported 3-unit posterior fixed partial dentures were prepared with supporting structures. Convex (1), concave (2), and conventional (3) pontic framework designs were simulated. A 300-N static vertical occlusal load was applied on the node at the center of occlusal surface of the pontic to calculate the stress distributions. As a second condition, frameworks were directly loaded to evaluate the effect of the framework design clearly. The Solidworks/Cosmosworks structural analysis programs were used for finite element modeling/analysis. Results: The analysis of the von Mises stress values revealed that maximum stress concentrations were located at the loading areas for all models. The pontic side marginal edges of restorations and the necks of implants were other stress concentration regions. There was no clear difference among models when the restorations were loaded at occlusal surfaces. When the veneering porcelain was removed, and load was applied directly to the framework, there was a clear increase in stress concentration with a concave design on supporting implants and bone structure. Conclusions: The present study showed that the use of a concave design in the pontic frameworks of fixed partial dentures increases the von Mises stress levels on implant abutments and supporting bone structure. However, the veneering porcelain element reduces the effect of the framework and compensates for design weaknesses. PMID:20922156
CSDMS2.0: Computational Infrastructure for Community Surface Dynamics Modeling
NASA Astrophysics Data System (ADS)
Syvitski, J. P.; Hutton, E.; Peckham, S. D.; Overeem, I.; Kettner, A.
2012-12-01
The Community Surface Dynamic Modeling System (CSDMS) is an NSF-supported, international and community-driven program that seeks to transform the science and practice of earth-surface dynamics modeling. CSDMS integrates a diverse community of more than 850 geoscientists representing 360 international institutions (academic, government, industry) from 60 countries and is supported by a CSDMS Interagency Committee (22 Federal agencies), and a CSDMS Industrial Consortia (18 companies). CSDMS presently distributes more than 200 Open Source models and modeling tools, access to high performance computing clusters in support of developing and running models, and a suite of products for education and knowledge transfer. CSDMS software architecture employs frameworks and services that convert stand-alone models into flexible "plug-and-play" components to be assembled into larger applications. CSDMS2.0 will support model applications within a web browser, on a wider variety of computational platforms, and on other high performance computing clusters to ensure robustness and sustainability of the framework. Conversion of stand-alone models into "plug-and-play" components will employ automated wrapping tools. Methods for quantifying model uncertainty are being adapted as part of the modeling framework. Benchmarking data is being incorporated into the CSDMS modeling framework to support model inter-comparison. Finally, a robust mechanism for ingesting and utilizing semantic mediation databases is being developed within the Modeling Framework. Six new community initiatives are being pursued: 1) an earth-ecosystem modeling initiative to capture ecosystem dynamics and ensuing interactions with landscapes, 2) a geodynamics initiative to investigate the interplay among climate, geomorphology, and tectonic processes, 3) an Anthropocene modeling initiative, to incorporate mechanistic models of human influences, 4) a coastal vulnerability modeling initiative, with emphasis on deltas and their multiple threats and stressors, 5) a continental margin modeling initiative, to capture extreme oceanic and atmospheric events generating turbidity currents in the Gulf of Mexico, and 6) a CZO Focus Research Group, to develop compatibility between CSDMS architecture and protocols and Critical Zone Observatory-developed models and data.
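CSDMS couples models through a common component interface (its Basic Model Interface); the toy component below only gestures at that idea with a made-up state variable, time step and variable name, and is not CSDMS code.

```python
class ToyErosionComponent:
    """Illustrative plug-and-play component with a BMI-like life cycle."""

    def initialize(self, config=None):
        self.time = 0.0
        self.dt = 1.0                      # years (invented)
        self.elevation = 100.0             # m (invented)
        self.erosion_rate = 0.01           # m/yr (invented)

    def update(self):
        self.elevation -= self.erosion_rate * self.dt
        self.time += self.dt

    def get_value(self, name):
        return {"topographic__elevation": self.elevation}[name]

    def finalize(self):
        pass

# A driver (or another component) only needs the interface, not the internals.
comp = ToyErosionComponent()
comp.initialize()
for _ in range(10):
    comp.update()
print(comp.get_value("topographic__elevation"))
comp.finalize()
```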
NASA Astrophysics Data System (ADS)
Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.
2012-04-01
Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, in the last decades several methods have been developed and applied in different hydrological conditions. However, in most of the cases, the studies have been done by investigating mainly the influence of the parameter uncertainty on the simulated outputs, and few approaches have tried to also consider other sources of uncertainty, i.e. input and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physical-based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). For both cases, input and parameters are considered as major sources of uncertainty. Evaluation of the models was based on soil moisture detected at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account all the various sources of uncertainty, i.e. input data, parameters (either in scalar or spatially distributed form) and model structures. The framework can be used in a loop in order to optimize further monitoring activities used to improve the performance of the model. In the particular applications, the results show how the sources of uncertainty are specific for each process considered. The influence of the input data, as well as the presence of compensating errors, becomes clear from the different processes simulated.
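A minimal version of the Monte Carlo/Sobol workflow described above, shown with the SALib package (assumed available) and a toy two-parameter function standing in for SWAP or SHETRAN:

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Toy stand-in for a hydrological model: output depends on two "parameters".
def model(x):
    k_sat, rain_scale = x
    return np.sqrt(k_sat) + 2.0 * rain_scale

problem = {
    "num_vars": 2,
    "names": ["k_sat", "rain_scale"],
    "bounds": [[0.1, 10.0], [0.5, 1.5]],
}

param_values = saltelli.sample(problem, 1024)       # Monte Carlo design
Y = np.array([model(x) for x in param_values])
Si = sobol.analyze(problem, Y)

print(dict(zip(problem["names"], Si["S1"])))        # first-order indices
print(dict(zip(problem["names"], Si["ST"])))        # total-order indices
```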
A framework for global river flood risk assessment
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.
2012-04-01
There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) International Financing Institutes and Disaster Management Agencies to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse-resolution hazard probability distributions, derived from global hydrological model runs (typically about 0.5 degree resolution), with high-resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use and economic value generally are strongly variable in space and time. The framework therefore estimates hazard at a high resolution (about 1 km2) by using (a) global forcing data sets of the current (or, in scenario mode, future) climate; (b) a global hydrological model; (c) a global flood routing model; and (d), importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models and sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model. Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm, applied on a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show sensitivities of the estimated risks with regard to the use of different climate input datasets, decisions made in the downscaling algorithm, and different approaches to establish distributed estimates of GDP and asset exposure to flooding.
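One common way to combine a hazard probability distribution with an impact model into a risk indicator is to integrate losses over exceedance probability (expected annual damage); the return periods, depths and depth-damage curve below are invented.

```python
import numpy as np

# Hypothetical hazard: flood depth (m) at one cell for several return periods.
return_periods = np.array([2, 5, 10, 25, 50, 100, 250])         # years
depth          = np.array([0.0, 0.2, 0.5, 0.9, 1.3, 1.8, 2.5])  # m

def damage(depth_m, exposed_value=1.0e6):
    """Invented depth-damage curve: fraction of exposed value lost."""
    frac = np.clip(depth_m / 3.0, 0.0, 1.0)
    return frac * exposed_value

exceedance_prob = 1.0 / return_periods
losses = damage(depth)

# Expected annual damage = area under the loss-exceedance curve
# (trapezoidal rule over exceedance probability).
ead = np.trapz(losses[::-1], exceedance_prob[::-1])
print(f"Expected annual damage: {ead:,.0f} (currency units)")
```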
A Framework for Curriculum Research.
ERIC Educational Resources Information Center
Kimpston, Richard D.; Rogers, Karen B.
1986-01-01
A framework for generating curriculum research is proposed from a synthesis of Dunkin and Biddle's model of teaching variables with Beauchamp's "curriculum system" planning functions. The framework systematically defines variables that delineate curriculum planning processes. (CJH)
ERIC Educational Resources Information Center
Rockinson-Szapkiw, Amanda J.; Wendt, Jillian; Wighting, Mervyn; Nisbet, Deanna
2016-01-01
The Community of Inquiry framework has been widely supported by research to provide a model of online learning that informs the design and implementation of distance learning courses. However, the relationship between elements of the CoI framework and perceived learning warrants further examination as a predictive model for online graduate student…
A revised Self- and Family Management Framework.
Grey, Margaret; Schulman-Green, Dena; Knafl, Kathleen; Reynolds, Nancy R
2015-01-01
Research on self- and family management of chronic conditions has advanced over the past 6 years, but the use of simple frameworks has hampered the understanding of the complexities involved. We sought to update our previously published model with new empirical, synthetic, and theoretical work. We used synthesis of previous studies to update the framework. We propose a revised framework that clarifies facilitators and barriers, processes, proximal outcomes, and distal outcomes of self- and family management and their relationships. We offer the revised framework as a model that can be used in studies aimed at advancing self- and family management science. The use of the framework to guide studies would allow for the design of studies that can address more clearly how self-management interventions work and under what conditions. Copyright © 2015 Elsevier Inc. All rights reserved.
Implementing Value-Based Payment Reform: A Conceptual Framework and Case Examples.
Conrad, Douglas A; Vaughn, Matthew; Grembowski, David; Marcus-Smith, Miriam
2016-08-01
This article develops a conceptual framework for implementation of value-based payment (VBP) reform and then draws on that framework to systematically examine six distinct multi-stakeholder coalition VBP initiatives in three different regions of the United States. The VBP initiatives deploy the following payment models: reference pricing, "shadow" primary care capitation, bundled payment, pay for performance, shared savings within accountable care organizations, and global payment. The conceptual framework synthesizes prior models of VBP implementation. It describes how context, project objectives, payment and care delivery strategies, and the barriers and facilitators to translating strategy into implementation affect VBP implementation and value for patients. We next apply the framework to six case examples of implementation, and conclude by discussing the implications of the case examples and the conceptual framework for future practice and research. © The Author(s) 2015.
Saenz, Juan A.; Chen, Qingshan; Ringler, Todd
2015-05-19
Recent work has shown that taking the thickness-weighted average (TWA) of the Boussinesq equations in buoyancy coordinates results in exact equations governing the prognostic residual mean flow where eddy–mean flow interactions appear in the horizontal momentum equations as the divergence of the Eliassen–Palm flux tensor (EPFT). It has been proposed that, given the mathematical tractability of the TWA equations, the physical interpretation of the EPFT, and its relation to potential vorticity fluxes, the TWA is an appropriate framework for modeling ocean circulation with parameterized eddies. The authors test the feasibility of this proposition and investigate the connections between the TWA framework and the conventional framework used in models, where Eulerian mean flow prognostic variables are solved for. Using the TWA framework as a starting point, this study explores the well-known connections between vertical transfer of horizontal momentum by eddy form drag and eddy overturning by the bolus velocity, used by Greatbatch and Lamb and Gent and McWilliams to parameterize eddies. After implementing the TWA framework in an ocean general circulation model, we verify our analysis by comparing the flows in an idealized Southern Ocean configuration simulated using the TWA and conventional frameworks with the same mesoscale eddy parameterization.
Delaney, Declan T.; O’Hare, Gregory M. P.
2016-01-01
No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks. PMID:27916929
Delaney, Declan T; O'Hare, Gregory M P
2016-12-01
No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks.
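The selection logic, predicting each candidate solution's performance in the observed environment and choosing the best, can be sketched as follows; the environment features, candidate solutions and performance models are placeholders, not the article's.

```python
# Placeholder performance models: predicted packet delivery ratio (higher is
# better) as a function of observed environment features.
def predict_pdr_solution_a(env):
    return 0.95 - 0.002 * env["node_count"] - 0.03 * env["interference"]

def predict_pdr_solution_b(env):
    return 0.90 - 0.0005 * env["node_count"] - 0.01 * env["interference"]

PERFORMANCE_MODELS = {
    "solution_A": predict_pdr_solution_a,
    "solution_B": predict_pdr_solution_b,
}

def choose_solution(env, models=PERFORMANCE_MODELS):
    """Return the candidate with the best predicted metric for this deployment."""
    predictions = {name: m(env) for name, m in models.items()}
    best = max(predictions, key=predictions.get)
    return best, predictions

env = {"node_count": 120, "interference": 2.0}   # observed deployment features
print(choose_solution(env))
```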
Predictive Models and Computational Toxicology (II IBAMTOX)
EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...
Multimodal Speaker Diarization.
Noulas, A; Englebienne, G; Krose, B J A
2012-01-01
We present a novel probabilistic framework that fuses information coming from the audio and video modality to perform speaker diarization. The proposed framework is a Dynamic Bayesian Network (DBN) that is an extension of a factorial Hidden Markov Model (fHMM) and models the people appearing in an audiovisual recording as multimodal entities that generate observations in the audio stream, the video stream, and the joint audiovisual space. The framework is very robust to different contexts, makes no assumptions about the location of the recording equipment, and does not require labeled training data as it acquires the model parameters using the Expectation Maximization (EM) algorithm. We apply the proposed model to two meeting videos and a news broadcast video, all of which come from publicly available data sets. The results acquired in speaker diarization are in favor of the proposed multimodal framework, which outperforms the single modality analysis results and improves over the state-of-the-art audio-based speaker diarization.
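The paper's model is a multimodal DBN; as a drastically simplified, audio-only stand-in, the sketch below fits a Gaussian HMM with EM (via the hmmlearn package, assumed available) to synthetic frame features and reads speaker segments off the decoded states.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Synthetic "audio features": three speakers with different feature means.
frames = np.vstack([rng.normal(m, 0.5, size=(200, 13)) for m in (0.0, 2.0, 4.0)])

# One hidden state per putative speaker; parameters learned by EM (fit).
hmm = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
hmm.fit(frames)
states = hmm.predict(frames)

# Turn the frame-level state sequence into (start, end, state) segments.
segments, start = [], 0
for i in range(1, len(states) + 1):
    if i == len(states) or states[i] != states[start]:
        segments.append((start, i - 1, int(states[start])))
        start = i
print(segments[:5])
```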
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.
Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.
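A toy version of the investment-planning problem, choosing protective investments within a budget to minimize expected post-event loss across hazard scenarios, might look like the brute-force sketch below; all costs, probabilities and mitigation effects are invented.

```python
from itertools import combinations

BUDGET = 8.0   # available funding (illustrative units)

# (name, cost, {scenario: fractional loss reduction}) -- all values invented.
investments = [
    ("harden_substation", 5.0, {"earthquake": 0.4, "attack": 0.1}),
    ("backup_generators", 3.0, {"earthquake": 0.2, "attack": 0.3}),
    ("perimeter_security", 4.0, {"earthquake": 0.0, "attack": 0.5}),
]

# Hazard scenarios: (annual probability, unmitigated loss).
scenarios = {"earthquake": (0.02, 500.0), "attack": (0.01, 800.0)}

def expected_loss(chosen):
    total = 0.0
    for scen, (prob, loss) in scenarios.items():
        reduction = min(sum(inv[2][scen] for inv in chosen), 0.9)
        total += prob * loss * (1.0 - reduction)
    return total

best = None
for r in range(len(investments) + 1):
    for combo in combinations(investments, r):
        if sum(inv[1] for inv in combo) <= BUDGET:
            loss = expected_loss(combo)
            if best is None or loss < best[0]:
                best = (loss, [inv[0] for inv in combo])
print(best)
```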
A theoretical framework for psychiatric nursing practice.
Onega, L L
1991-01-01
Traditionally, specific theoretical frameworks which are congruent with psychiatric nursing practice have been poorly articulated. The purpose of this paper is to identify and discuss a philosophical base, a theoretical framework, application to psychiatric nursing, and issues related to psychiatric nursing knowledge development and practice. A philosophical framework that is likely to be congruent with psychiatric nursing, which is based on the nature of human beings, health, psychiatric nursing and reality, is identified. Aaron Antonovsky's Salutogenic Model is discussed and applied to psychiatric nursing. This model provides a helpful way for psychiatric nurses to organize their thinking processes and ultimately improve the health care services that they offer to their clients. Goal setting and nursing interventions using this model are discussed. Additionally, application of the use of Antonovsky's model is made to nursing research areas such as hardiness, uncertainty, suffering, empathy and literary works. Finally, specific issues related to psychiatric nursing are addressed.
Mechanochemical models of processive molecular motors
NASA Astrophysics Data System (ADS)
Lan, Ganhui; Sun, Sean X.
2012-05-01
Motor proteins are the molecular engines powering the living cell. These nanometre-sized molecules convert chemical energy, both enthalpic and entropic, into useful mechanical work. High-resolution single-molecule experiments can now observe motor protein movement with increasing precision. The emerging data must be combined with structural and kinetic measurements to develop a quantitative mechanism. This article describes a modelling framework where quantitative understanding of motor behaviour can be developed based on the protein structure. The framework is applied to myosin motors, with emphasis on how synchrony between motor domains gives rise to processive unidirectional movement. The modelling approach shows that the elasticity of protein domains is important in regulating motor function. Simple models of protein domain elasticity are presented. The framework can be generalized to other motor systems, or an ensemble of motors such as muscle contraction. Indeed, for hundreds of myosins, our framework can be reduced to the Huxley-Simmons description of muscle movement in the mean-field limit.
Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trebotich, D
We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.
Modeling complex biological flows in multi-scale systems using the APDEC framework
NASA Astrophysics Data System (ADS)
Trebotich, David
2006-09-01
We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.
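A purely illustrative Brownian dynamics sketch of a bead-spring chain in an implicit viscous solvent (Hookean springs standing in for the rigid rod constraints, all parameter values invented); it is not the embedded-boundary solver described above.

```python
import numpy as np

rng = np.random.default_rng(1)

n_beads, dt, n_steps = 10, 1e-4, 1000
k_spring = 50.0     # spring constant standing in for the rod constraints
gamma = 1.0         # bead drag coefficient (implicit solvent)
kT = 1.0            # thermal energy
b = 1.0             # equilibrium bond length

x = np.zeros((n_beads, 3))
x[:, 0] = np.arange(n_beads) * b          # initially straight chain

for _ in range(n_steps):
    bond = x[1:] - x[:-1]
    length = np.linalg.norm(bond, axis=1, keepdims=True)
    f_bond = k_spring * (length - b) * bond / length    # Hookean bond forces
    force = np.zeros_like(x)
    force[:-1] += f_bond
    force[1:] -= f_bond
    noise = rng.normal(0.0, np.sqrt(2.0 * kT * dt / gamma), size=x.shape)
    x += (force / gamma) * dt + noise                   # overdamped Langevin step

print("End-to-end distance:", np.linalg.norm(x[-1] - x[0]))
```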
Short-Term Global Horizontal Irradiance Forecasting Based on Sky Imaging and Pattern Recognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Feng, Cong; Cui, Mingjian
Accurate short-term forecasting is crucial for solar integration in the power grid. In this paper, a classification forecasting framework based on pattern recognition is developed for 1-hour-ahead global horizontal irradiance (GHI) forecasting. Three sets of models in the forecasting framework are trained by the data partitioned from the preprocessing analysis. The first two sets of models forecast GHI for the first four daylight hours of each day. Then the GHI values in the remaining hours are forecasted by an optimal machine learning model determined based on a weather pattern classification model in the third model set. The weather pattern is determined by a support vector machine (SVM) classifier. The developed framework is validated by the GHI and sky imaging data from the National Renewable Energy Laboratory (NREL). Results show that the developed short-term forecasting framework outperforms the persistence benchmark by 16% in terms of the normalized mean absolute error and 25% in terms of the normalized root mean square error.
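The pattern-recognition step can be illustrated with scikit-learn (assumed available): an SVM assigns a weather pattern to each hour, and a per-pattern regression model then issues the 1-hour-ahead GHI forecast; the features and data here are synthetic placeholders, not the NREL data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic training data: weather/sky-image features, pattern labels
# (0 = clear, 1 = cloudy), current GHI, and GHI observed one hour later.
n = 500
features = rng.normal(size=(n, 4))
pattern = (features[:, 0] > 0).astype(int)
ghi_now = np.where(pattern == 0, 700.0, 300.0) + rng.normal(0, 30, n)
ghi_next = 0.9 * ghi_now + rng.normal(0, 20, n)

clf = SVC(kernel="rbf").fit(features, pattern)     # weather-pattern classifier

per_pattern_model = {}
for p in (0, 1):                                   # one regressor per pattern
    mask = pattern == p
    per_pattern_model[p] = LinearRegression().fit(
        ghi_now[mask].reshape(-1, 1), ghi_next[mask])

# Forecast: classify the current hour's pattern, then apply that pattern's model.
x_new = rng.normal(size=(1, 4))
p_hat = int(clf.predict(x_new)[0])
print(p_hat, per_pattern_model[p_hat].predict([[650.0]]))
```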
Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.
Gomez, Christophe; Hartung, Niklas
2018-01-01
Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
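A minimal simulation of the first framework, emission times drawn from a Poisson process whose intensity grows with primary tumour size, with the growth law and all rates invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def primary_size(t, v0=1e-3, growth=0.3):
    """Invented exponential growth law for the primary tumour."""
    return v0 * np.exp(growth * t)

def emission_rate(t, mu=5.0, alpha=0.7):
    """Emission intensity increasing with tumour size (invented parameters)."""
    return mu * primary_size(t) ** alpha

def simulate_emission_times(t_max=15.0):
    """Inhomogeneous Poisson process sampled by thinning (rate is increasing)."""
    lam_max = emission_rate(t_max)
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_max:
            return np.array(times)
        if rng.random() < emission_rate(t) / lam_max:
            times.append(t)

emissions = simulate_emission_times()
print(len(emissions), "emission times:", np.round(emissions, 2))
```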
Framework for the Parametric System Modeling of Space Exploration Architectures
NASA Technical Reports Server (NTRS)
Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II
2008-01-01
This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.
Nowcasting Ground Magnetic Perturbations with the Space Weather Modeling Framework
NASA Astrophysics Data System (ADS)
Welling, D. T.; Toth, G.; Singer, H. J.; Millward, G. H.; Gombosi, T. I.
2015-12-01
Predicting ground-based magnetic perturbations is a critical step towards specifying and predicting geomagnetically induced currents (GICs) in high voltage transmission lines. Currently, the Space Weather Modeling Framework (SWMF), a flexible modeling framework for simulating the multi-scale space environment, is being transitioned from research to operational use (R2O) by NOAA's Space Weather Prediction Center. Upon completion of this transition, the SWMF will provide localized dB/dt predictions using real-time solar wind observations from L1 and the F10.7 proxy for EUV as model input. This presentation describes the operational SWMF setup and summarizes the changes made to the code to enable R2O progress. The framework's algorithm for calculating ground-based magnetometer observations will be reviewed. Metrics from data-model comparisons will be reviewed to illustrate predictive capabilities. Early data products, such as regional-K index and grids of virtual magnetometer stations, will be presented. Finally, early successes will be shared, including the code's ability to reproduce the recent March 2015 St. Patrick's Day Storm.
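For context, a K-type index summarizes the range of the horizontal magnetic disturbance in a 3-hour window against quasi-logarithmic thresholds; the sketch below uses a synthetic perturbation series and commonly quoted mid-latitude thresholds purely for illustration (the quiet-day curve removal used in practice is omitted).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 3-hour horizontal perturbation time series (nT), 1-minute cadence.
minutes = np.arange(180)
dB_h = 40.0 * np.sin(2 * np.pi * minutes / 90.0) + rng.normal(0, 5, size=minutes.size)

# Quasi-logarithmic K thresholds (nT) for a station with a K=9 limit of 500 nT;
# illustrative values, not the operational station-specific scaling.
k_limits = np.array([5, 10, 20, 40, 70, 120, 200, 330, 500])

disturbance_range = dB_h.max() - dB_h.min()
k_index = int(np.searchsorted(k_limits, disturbance_range))
print(f"Range = {disturbance_range:.0f} nT -> K = {k_index}")
```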
Health level 7 development framework for medication administration.
Kim, Hwa Sun; Cho, Hune
2009-01-01
We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model for understanding international-standard methodology by healthcare professionals and nursing practitioners with the objective of modeling healthcare information systems.
TLS and photogrammetry for the modeling of a historic wooden framework
NASA Astrophysics Data System (ADS)
Koehl, M.; Viale, M.
2012-04-01
The building studied here is located in the center of Andlau, France. This mansion, built in 1582, was the residence of the Lords of Andlau from the XVIth century until the French Revolution. Its architecture represents the Renaissance style of the XVIth century, in particular through its volutes and the spiral staircase inside the polygonal turret. In January 2005, the municipality of Andlau became the owner of this Seigneury, which is intended to host the future Heritage Interpretation Center (HIC); a museum is also going to be created there. Three attic levels of this building are going to be restored and insulated, so the historic framework will be masked and the last three levels will no longer be accessible. In this context, our lab was asked to model the framework so that diagnoses could be made, and to build and consolidate knowledge of this type of historic framework. In addition to a virtual visualization, we provided other applications, in particular an accurate 3D model of the framework for animations, as the foundation of a historical information system, and for supplying the future museum and HIC with digital data. The project contains different phases: data acquisition, model creation and data structuring, creation of an interactive model, and integration into a historic information system. All levels of the attic were acquired: a Trimble GX 3D scanner and, in part, a Trimble CX scanner were used, in particular for acquiring data in the highest part of the framework. The various scans were directly georeferenced in the field using control points, then merged into a single point cloud covering the whole structure. Several panoramic photos were also taken to create a virtual tour of the framework and the surroundings of the Seigneury. The purpose of the project was to supply a 3D model allowing the creation of scenographies and interactive content to be integrated into an informative device. In that way, the public can easily visualize the framework, manipulate the 3D model, and discover the construction and the various parts of the historical wooden structure. The raw point cloud cannot be used for this kind of application; it is thus necessary to create an exploitable model from the data it supplies. Several parameters must be taken into account: the level of detail of the 3D model, the time needed to model all the beams, the size of the final files and, finally, the type of applied texture. The idea was to implement a workflow reconciling these various criteria, and several methods were tested. This project produced a range of solutions (3D models of the complete framework, a virtual tour, interactive 3D models, video animations) that allow an uninitiated public to take advantage of 3D material and software often reserved for professionals. The work was completed by a comparison between a theoretical model of the framework and a more detailed model of the current state, which made it possible to make diagnoses, to study the movements of the structure over time, and to supply important data for rehabilitation and renovation operations.
A Framework for Understanding Physics Students' Computational Modeling Practices
NASA Astrophysics Data System (ADS)
Lunk, Brandon Robert
With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by their existing physics content knowledge, particularly their knowledge of analytic procedures. While this existing knowledge was often applied in inappropriate circumstances, the students were still able to display a considerable amount of understanding of the physics content and of analytic solution procedures. These observations could not be adequately accommodated by the existing literature on programming comprehension. In extending the resource framework to the task of computational modeling, I model students' practices in terms of three important elements. First, a knowledge base includes resources for understanding physics, math, and programming structures. Second, a mechanism for monitoring and control describes students' expectations as being directed towards numerical, analytic, qualitative or rote solution approaches, and these expectations can be influenced by the problem representation. Third, a set of solution approaches, many of which were identified in this study, describe what aspects of the knowledge base students use and how they use that knowledge to enact their expectations. This framework allows us as researchers to track student discussions and pinpoint the source of difficulties. This work opens up many avenues of potential research. First, this framework gives researchers a vocabulary for extending Resource Theory to other domains of instruction, such as modeling how physics students use graphs. Second, this framework can be used as the basis for modeling expert physicists' programming practices. Important instructional implications also follow from this research. Namely, as we broaden the use of computational modeling in the physics classroom, our instructional practices should focus on helping students understand the step-by-step nature of programming in contrast to the already salient analytic procedures.
Toward a unified approach to dose-response modeling in ecotoxicology.
Ritz, Christian
2010-01-01
This study reviews dose-response models used in ecotoxicology. The focus is on clarifying differences and similarities between models and, as a side effect, unravelling their different guises in ecotoxicology. A look at frequently used dose-response models reveals major discrepancies, not least in naming conventions. There is therefore a need for a unified view of dose-response modeling in order to improve understanding, and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models of practical interest to ecotoxicologists. The framework includes commonly used models such as the log-logistic and Weibull models, but also covers entire suites of models found in various guidance documents. An outline of how the proposed framework can be implemented in statistical software systems is also provided.
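As an illustration only (not the author's implementation, which targets statistical software), a minimal sketch of fitting one model from such a framework, the four-parameter log-logistic curve, to hypothetical dose-response data:

```python
# Minimal sketch: fitting a four-parameter log-logistic dose-response curve
# to synthetic ecotoxicological data. Dose levels and responses are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic_4p(dose, lower, upper, ec50, slope):
    """Four-parameter log-logistic model, a common dose-response form."""
    return lower + (upper - lower) / (1.0 + (dose / ec50) ** slope)

# Hypothetical dose levels and measured responses (e.g., relative growth)
doses = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
responses = np.array([0.98, 0.95, 0.90, 0.60, 0.35, 0.10, 0.05])

params, _ = curve_fit(log_logistic_4p, doses, responses,
                      p0=[0.0, 1.0, 5.0, 1.0])  # rough initial guesses
lower, upper, ec50, slope = params
print(f"Estimated EC50: {ec50:.2f}")
```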
ERIC Educational Resources Information Center
Belser, Christopher T.; Shillingford, M. Ann; Joe, J. Richelle
2016-01-01
The American School Counselor Association (ASCA) National Model and a multi-tiered system of supports (MTSS) both provide frameworks for systematically solving problems in schools, including student behavior concerns. The authors outline a model that integrates overlapping elements of the National Model and MTSS as a support for marginalized…
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are increasingly being used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions about the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013) to incorporate relative survival, and robust and cluster-robust standard errors. We describe the general framework through three applications to clinical datasets, in particular illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
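As a hedged illustration of the core idea rather than the authors' Stata implementation, the sketch below fits a restricted cubic spline to the log hazard of simulated data, using purely numerical integration for the cumulative hazard (the paper's combined analytic/numerical treatment is not reproduced); the knots, data, and optimizer choices are assumptions:

```python
# Sketch: flexible parametric survival model with log-hazard as a restricted
# cubic spline of log(time), full-likelihood fit with a numerical cumulative hazard.
import numpy as np
from scipy.optimize import minimize
from scipy.integrate import trapezoid

def rcs_basis(x, knots):
    """Restricted cubic spline basis (Durrleman-Simon form) at points x."""
    k = np.asarray(knots, dtype=float)
    k_penult, k_last = k[-2], k[-1]
    pos3 = lambda u: np.clip(u, 0.0, None) ** 3
    cols = [x]
    for kj in k[:-2]:
        cols.append(pos3(x - kj)
                    - pos3(x - k_penult) * (k_last - kj) / (k_last - k_penult)
                    + pos3(x - k_last) * (k_penult - kj) / (k_last - k_penult))
    return np.column_stack(cols)

def neg_log_lik(gamma, t, event, knots):
    """Negative log-likelihood: sum_i d_i*log h(t_i) - H(t_i)."""
    def log_hazard(times):
        return gamma[0] + rcs_basis(np.log(times), knots) @ gamma[1:]
    ll = 0.0
    for ti, di in zip(t, event):
        grid = np.linspace(1e-6, ti, 200)
        H = trapezoid(np.exp(log_hazard(grid)), grid)   # numerical cumulative hazard
        ll += di * log_hazard(np.array([ti]))[0] - H
    return -ll

# Simulated survival times (arbitrary units) and event indicators
rng = np.random.default_rng(1)
t = rng.weibull(1.5, size=150) * 3.0
event = rng.integers(0, 2, size=150)
knots = np.percentile(np.log(t), [5, 35, 65, 95])       # knots on the log-time scale

gamma0 = np.zeros(len(knots))                            # intercept + 3 spline terms
fit = minimize(neg_log_lik, gamma0, args=(t, event, knots), method="Nelder-Mead")
print(fit.x)
```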
Parameterization models for pesticide exposure via crop consumption.
Fantke, Peter; Wieland, Peter; Juraske, Ronnie; Shaddick, Gavin; Itoiz, Eva Sevigné; Friedrich, Rainer; Jolliet, Olivier
2012-12-04
An approach for estimating human exposure to pesticides via consumption of six important food crops is presented that can be used to extend multimedia models applied in health risk and life cycle impact assessment. We first assessed the variation of model output (pesticide residues per kg applied) as a function of model input variables (substance, crop, and environmental properties), including their possible correlations, using matrix algebra. We identified five key parameters responsible for between 80% and 93% of the variation in pesticide residues, namely the time between substance application and crop harvest, degradation half-lives in crops and on crop surfaces, overall residence times in soil, and substance molecular weight. Partition coefficients also play an important role for fruit trees and tomato (Kow), potato (Koc), and lettuce (Kaw, Kow). Focusing on these parameters, we develop crop-specific models by parameterizing a complex fate and exposure assessment framework. The parametric models thereby reflect the framework's physical and chemical mechanisms and predict pesticide residues in harvest using linear combinations of crop, crop surface, and soil compartments. Parametric model results correspond well with results from the complex framework for 1540 substance-crop combinations, with total deviations between a factor of 4 (potato) and a factor of 66 (lettuce). Predicted residues also correspond well with experimental data previously used to evaluate the complex framework. Pesticide mass in harvest can finally be combined with reduction factors accounting for food processing to estimate human exposure from crop consumption. All parametric models can be easily implemented into existing assessment frameworks.
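A deliberately simplified sketch of the dominant dependency the study identifies (residue at harvest driven by the time to harvest and a dissipation half-life); the single-compartment form and all parameter values are hypothetical, not the authors' parametric models:

```python
# Highly simplified sketch: first-order dissipation in a single crop compartment,
# driven by time between application and harvest and a crop half-life.
import numpy as np

def residue_at_harvest(applied_mg_per_kg, dt_days, half_life_crop_days):
    """Residue remaining after first-order dissipation over dt_days."""
    k = np.log(2.0) / half_life_crop_days
    return applied_mg_per_kg * np.exp(-k * dt_days)

# e.g. 1 mg/kg applied, 14 days to harvest, DT50 = 5 days (all hypothetical)
print(residue_at_harvest(1.0, 14.0, 5.0))   # ~0.14 mg/kg remaining
```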
A new framework to increase the efficiency of large-scale solar power plants.
NASA Astrophysics Data System (ADS)
Alimohammadi, Shahrouz; Kleissl, Jan P.
2015-11-01
A new framework to estimate the spatio-temporal behavior of solar power is introduced, which predicts the statistical behavior of power output at utility-scale photovoltaic (PV) power plants. The framework is based on spatio-temporal Gaussian process regression (kriging) models, which incorporate satellite data with the UCSD version of the Weather Research and Forecasting model. The framework is designed to improve the efficiency of large-scale solar power plants. The results are validated against measurements from local pyranometer sensors, and improvements are observed in several scenarios.
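A minimal sketch of the underlying statistical idea, Gaussian process regression (kriging) on spatial inputs, using synthetic data; the coupling to satellite imagery and WRF forecasts is not shown, and all names and values are illustrative:

```python
# Sketch: Gaussian process regression (kriging) over hypothetical sensor locations.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(40, 2))             # hypothetical (x, y) sensor locations
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)  # hypothetical irradiance-like signal

kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = np.array([[5.0, 5.0], [7.5, 2.0]])        # locations of interest (e.g. plant site)
mean, std = gp.predict(X_new, return_std=True)    # predictive mean and uncertainty
print(mean, std)
```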
A framework for global river flood risk assessments
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.
2012-08-01
There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate. The framework estimates hazard at high resolution (~1 km2) using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood routing model, and importantly, a flood extent downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case-study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard and damage estimates has been performed using the Dartmouth Flood Observatory database and damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
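A small sketch of the final risk step described above, integrating damages over exceedance probability to obtain an expected annual damage; the return periods and damage figures are purely illustrative:

```python
# Sketch: expected annual damage from damages estimated at several return periods.
import numpy as np
from scipy.integrate import trapezoid

return_periods = np.array([2, 5, 10, 25, 50, 100, 250])   # years
damages = np.array([0.0, 0.1, 0.4, 1.2, 2.5, 4.0, 6.5])   # e.g. billion USD (illustrative)

exceedance_prob = 1.0 / return_periods
order = np.argsort(exceedance_prob)          # integrate over increasing probability
ead = trapezoid(damages[order], exceedance_prob[order])
print(f"Expected annual damage: {ead:.3f}")
```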
NASA Astrophysics Data System (ADS)
Ficklin, D. L.; Abatzoglou, J. T.
2017-12-01
The spatial variability in the balance between surface runoff (Q) and evapotranspiration (ET) is critical for understanding water availability. The Budyko framework suggests that this balance is solely a function of aridity. Observed deviations from this framework for individual watersheds, however, can vary significantly, resulting in uncertainty in using the Budyko framework in ungauged catchments and under future climate and land use scenarios. Here, we model the spatial variability in the partitioning of precipitation into Q and ET using a set of climatic, physiographic, and vegetation metrics for 211 near-natural watersheds across the contiguous United States (CONUS) within Budyko's framework through the free parameter ω. Using a generalized additive model, we found that precipitation seasonality, the ratio of soil water holding capacity to precipitation, topographic slope, and the fraction of precipitation falling as snow explained 81.2% of the variability in ω. This ω model applied to the Budyko framework explained 97% of the spatial variability in long-term Q for an independent set of near-natural watersheds. The developed ω model was also used to estimate the entire CONUS surface water balance for both contemporary and mid-21st century conditions. The contemporary CONUS surface water balance compared favorably to more sophisticated land-surface modeling efforts. For mid-21st century conditions, the model simulated an increase in the fraction of precipitation used by ET across the CONUS with declines in Q for much of the eastern CONUS and mountainous watersheds across the western US. The Budyko framework using the modeled ω lends itself to an alternative approach for assessing the potential response of catchment water balance to climate change to complement other approaches.
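For concreteness, a sketch of the single-parameter Budyko curve in the widely used Fu (1981) form, which is presumably the ω formulation referred to here; the aridity and ω values below are illustrative:

```python
# Sketch: Fu (1981) form of the Budyko curve with free parameter omega.
def budyko_fu(aridity, omega):
    """Evaporative index ET/P as a function of aridity PET/P and omega."""
    return 1.0 + aridity - (1.0 + aridity ** omega) ** (1.0 / omega)

aridity = 1.2          # PET/P for a hypothetical watershed
omega = 2.6            # e.g. predicted from climate and physiographic covariates
et_fraction = budyko_fu(aridity, omega)
q_fraction = 1.0 - et_fraction    # long-term Q/P follows from the water balance
print(et_fraction, q_fraction)
```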
NASA Astrophysics Data System (ADS)
Astroza, Rodrigo; Ebrahimian, Hamed; Conte, Joel P.
2015-03-01
This paper describes a novel framework that combines advanced mechanics-based nonlinear (hysteretic) finite element (FE) models and stochastic filtering techniques to estimate unknown time-invariant parameters of nonlinear inelastic material models used in the FE model. Using input-output data recorded during earthquake events, the proposed framework updates the nonlinear FE model of the structure. The updated FE model can be directly used for damage identification and further used for damage prognosis. To update the unknown time-invariant parameters of the FE model, two alternative stochastic filtering methods are used: the extended Kalman filter (EKF) and the unscented Kalman filter (UKF). A three-dimensional, 5-story, 2-by-1 bay reinforced concrete (RC) frame is used to verify the proposed framework. The RC frame is modeled using fiber-section displacement-based beam-column elements with distributed plasticity and is subjected to the ground motion recorded at the Sylmar station during the 1994 Northridge earthquake. The results indicate that the proposed framework accurately estimates the unknown material parameters of the nonlinear FE model. The UKF outperforms the EKF when the relative root-mean-square errors of the recorded responses are compared. In addition, the results suggest that the convergence of the estimates of the modeling parameters is smoother and faster when the UKF is utilized.
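A minimal sketch of parameter estimation with an unscented Kalman filter, assuming the filterpy library and a toy nonlinear response function standing in for the full fiber-section FE model; everything below is an illustrative stand-in, not the authors' implementation:

```python
# Sketch: UKF estimation of time-invariant parameters treated as the filter state,
# with a toy nonlinear response function playing the role of the FE model.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

true_theta = np.array([2.0, 0.5])            # hypothetical "material" parameters

def response(theta, t):
    """Toy stand-in for the FE-predicted structural response at time t."""
    return theta[0] * np.sin(t) * np.exp(-theta[1] * t)

times = np.linspace(0.0, 10.0, 200)
rng = np.random.default_rng(0)
measured = response(true_theta, times) + 0.02 * rng.normal(size=times.size)

def fx(x, dt):
    return x                                  # parameters are time-invariant (random walk)

step = {"k": 0}
def hx(x):
    return np.array([response(x, times[step["k"]])])

points = MerweScaledSigmaPoints(n=2, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=times[1] - times[0],
                            fx=fx, hx=hx, points=points)
ukf.x = np.array([1.0, 1.0])                  # initial parameter guess
ukf.P = np.eye(2) * 0.5
ukf.Q = np.eye(2) * 1e-6                      # small drift keeps the filter responsive
ukf.R = np.array([[0.02 ** 2]])

for k, z in enumerate(measured):
    step["k"] = k
    ukf.predict()
    ukf.update(np.array([z]))
print(ukf.x)                                  # should approach true_theta
```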
Frameworks for change in healthcare organisations: a formative evaluation of the NHS Change Model.
Martin, Graham P; Sutton, Elizabeth; Willars, Janet; Dixon-Woods, Mary
2013-08-01
Organisational change in complex healthcare systems is a multifaceted process. The English National Health Service recently introduced a 'Change Model' that seeks to offer an evidence-based framework for guiding change. We report findings from a formative evaluation of the NHS Change Model and make recommendations for those developing the Model and its users. The evaluation involved 28 interviews with managers and clinicians making use of the Change Model in relation to a variety of projects. Interviews were fully transcribed and were analysed using an approach based on the Framework method. Participants saw the Change Model as valuable and practically useful. Fidelity to core principles of the Model was variable: participants often altered the Model, especially when using it to orchestrate the work of others. In challenging organisational contexts, the Change Model was sometimes used to delegitimise opposition rather than identify shared purpose among different interest groups. Those guiding change may benefit from frameworks, guidance and toolkits to structure and inform their planning and activities. Participants' experiences suggested the Change Model has much potential. Further work on its design and on supporting materials may optimise the approach, but its utility rests in particular on organisational cultures that support faithful application. © The Author(s) 2013.
Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M
2015-03-01
It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
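As a sketch of the performance-assessment step of the framework, the snippet below computes discrimination (c-statistic) and a calibration intercept and slope for a previously developed logistic prediction model in a hypothetical validation sample; the model coefficients and data are invented for illustration:

```python
# Sketch: assessing an existing prediction model in an external validation sample.
import numpy as np
from sklearn.metrics import roc_auc_score
import statsmodels.api as sm

rng = np.random.default_rng(2)
X_val = rng.normal(size=(500, 3))                      # validation-sample covariates
coef, intercept = np.array([0.8, -0.5, 0.3]), -1.0     # the previously developed model
lin_pred = intercept + X_val @ coef
p_hat = 1.0 / (1.0 + np.exp(-lin_pred))
y_val = rng.binomial(1, p_hat)                         # hypothetical observed outcomes

print("c-statistic:", roc_auc_score(y_val, lin_pred))
calib = sm.GLM(y_val, sm.add_constant(lin_pred),
               family=sm.families.Binomial()).fit()
print("calibration intercept, slope:", calib.params)   # ideal values: 0 and 1
```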
A dynamic water-quality modeling framework for the Neuse River estuary, North Carolina
Bales, Jerad D.; Robbins, Jeanne C.
1999-01-01
As a result of fish kills in the Neuse River estuary in 1995, nutrient reduction strategies were developed for point and nonpoint sources in the basin. However, because of the interannual variability in the natural system and the resulting complex hydrologic-nutrient interactions, it is difficult to detect through a short-term observational program the effects of management activities on Neuse River estuary water quality and aquatic health. A properly constructed water-quality model can be used to evaluate some of the potential effects of management actions on estuarine water quality. Such a model can be used to predict estuarine response to present and proposed nutrient strategies under the same set of meteorological and hydrologic conditions, thus removing the vagaries of weather and streamflow from the analysis. A two-dimensional, laterally averaged hydrodynamic and water-quality modeling framework was developed for the Neuse River estuary by using previously collected data. Development of the modeling framework consisted of (1) computational grid development, (2) assembly of data for model boundary conditions and model testing, (3) selection of initial values of model parameters, and (4) limited model testing. The model domain extends from Streets Ferry to Oriental, N.C., includes seven lateral embayments that have continual exchange with the mainstem of the estuary, three point-source discharges, and three tributary streams. Thirty-five computational segments represent the mainstem of the estuary, and the entire framework contains a total of 60 computational segments. Each computational cell is 0.5 meter thick; segment lengths range from 500 meters to 7,125 meters. Data that were used to develop the modeling framework were collected during March through October 1991 and represent the most comprehensive data set available prior to 1997. Most of the data were collected by the North Carolina Division of Water Quality, the University of North Carolina Institute of Marine Sciences, and the U.S. Geological Survey. Limitations in the modeling framework were clearly identified. These limitations formed the basis for a set of suggestions to refine the Neuse River estuary water-quality model.
Argumentation in Science Education: A Model-Based Framework
ERIC Educational Resources Information Center
Bottcher, Florian; Meisert, Anke
2011-01-01
The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…
Technical Assistance Model for Long-Term Systems Change: Three State Examples
ERIC Educational Resources Information Center
Kasprzak, Christina; Hurth, Joicey; Lucas, Anne; Marshall, Jacqueline; Terrell, Adriane; Jones, Elizabeth
2010-01-01
The National Early Childhood Technical Assistance Center (NECTAC) Technical Assistance (TA) Model for Long-Term Systems Change (LTSC) is grounded in conceptual frameworks in the literature on systems change and systems thinking. The NECTAC conceptual framework uses a logic model approach to change developed specifically for states' infant and…
Characteristics and Conceptual Framework of the Easy-Play Model
ERIC Educational Resources Information Center
Lu, Chunlei; Steele, Kyle
2014-01-01
The Easy-Play Model offers a defined framework to organize games that promote an inclusive and enjoyable sport experience. The model can be implemented by participants playing sports in educational, recreational or social contexts with the goal of achieving an active lifestyle in an inclusive, cooperative and enjoyable environment. The Easy-Play…
This research outlines a proposed Heavy-Duty Diesel Vehicle Modal Emission Modeling Framework (HDDV-MEMF) for heavy-duty diesel-powered trucks and buses. The heavy-duty vehicle modal modules being developed under this research effort, although different, should be compatible wi...
Learning situation models in a smart home.
Brdiczka, Oliver; Crowley, James L; Reignier, Patrick
2009-02-01
This paper addresses the problem of learning situation models for providing context-aware services. Context for modeling human behavior in a smart environment is represented by a situation model describing the environment, users, and their activities. A framework for acquiring and evolving different layers of a situation model in a smart environment is proposed. Different learning methods are presented as part of this framework: role detection per entity, unsupervised extraction of situations from multimodal data, supervised learning of situation representations, and evolution of a predefined situation model with feedback. The situation model serves as a frame and support for the different methods, making it possible to remain within an intuitive declarative framework. The proposed methods have been integrated into a complete system for a smart home environment. The implementation is detailed, and two evaluations are conducted in the smart home environment. The obtained results validate the proposed approach.
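As a hedged illustration of one layer of such a framework, unsupervised extraction of candidate situations, the sketch below clusters hypothetical multimodal observation vectors; the features and the choice of k-means are assumptions, not the paper's exact method:

```python
# Sketch: clustering multimodal observations into candidate situations.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# hypothetical observation vectors: [sound level, motion, persons present, appliance power]
observations = np.vstack([
    rng.normal([0.1, 0.0, 1.0, 0.05], 0.05, size=(50, 4)),   # e.g. a quiet "reading" situation
    rng.normal([0.6, 0.2, 2.0, 0.80], 0.10, size=(50, 4)),   # e.g. a "cooking together" situation
])

X = StandardScaler().fit_transform(observations)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))   # candidate situations, to be named/refined with supervision
```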
Hierarchical Bayesian Modeling of Fluid-Induced Seismicity
NASA Astrophysics Data System (ADS)
Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.
2017-11-01
In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to develop a probabilistic forecasting model. We apply the framework to the Basel 2006 fluid-induced seismicity case study to show that the hierarchical Bayesian model offers a suitable framework to coherently encode both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.
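A minimal sketch of the core building block, a Poisson process whose rate is proportional to the injection rate, with a conjugate Gamma prior on the proportionality parameter; the numbers are illustrative and the full hierarchical model has additional parameters and levels:

```python
# Sketch: Gamma-Poisson update for an induced-seismicity productivity parameter
# (expected events per unit injected volume), then a one-step-ahead forecast.
import numpy as np

# hypothetical hourly injected volumes (m^3) and induced event counts
injected = np.array([50.0, 80.0, 120.0, 150.0, 150.0, 100.0, 60.0])
events   = np.array([0,    1,    3,     4,     5,     2,     1])

alpha0, beta0 = 2.0, 100.0                  # Gamma prior on events per m^3 (illustrative)
alpha_n = alpha0 + events.sum()             # conjugate posterior update
beta_n = beta0 + injected.sum()

posterior_mean_rate = alpha_n / beta_n      # events per m^3 injected
next_volume = 150.0                         # planned injection for the next hour
forecast_mean = posterior_mean_rate * next_volume
print(posterior_mean_rate, forecast_mean)
```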
Introduction to the special section on mixture modeling in personality assessment.
Wright, Aidan G C; Hallquist, Michael N
2014-01-01
Latent variable models offer a conceptual and statistical framework for evaluating the underlying structure of psychological constructs, including personality and psychopathology. Complex structures that combine or compare categorical and dimensional latent variables can be accommodated using mixture modeling approaches, which provide a powerful framework for testing nuanced theories about psychological structure. This special series includes introductory primers on cross-sectional and longitudinal mixture modeling, in addition to empirical examples applying these techniques to real-world data collected in clinical settings. This group of articles is designed to introduce personality assessment scientists and practitioners to a general latent variable framework that we hope will stimulate new research and application of mixture models to the assessment of personality and its pathology.
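As a generic illustration (not tied to any particular article in the series), the sketch below fits cross-sectional Gaussian mixture models with different numbers of latent classes to simulated trait scores and compares them by BIC:

```python
# Sketch: comparing latent-class solutions for simulated personality scale scores.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# two hypothetical latent classes on four trait scales
class_a = rng.normal([0.0, 0.0, 0.0, 0.0], 1.0, size=(150, 4))
class_b = rng.normal([2.0, 1.5, -1.0, 0.5], 1.0, size=(100, 4))
scores = np.vstack([class_a, class_b])

for k in range(1, 5):
    gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(scores)
    print(k, round(gmm.bic(scores), 1))   # lower BIC suggests a better class solution
```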
A Framework to Manage Information Models
NASA Astrophysics Data System (ADS)
Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.
2008-05-01
The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. The modeling information can also be exported to semantic web languages such as OWL and RDF and written to XML Metadata Interchange (XMI) files for import into UML tools.
Metadata mapping and reuse in caBIG.
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-02-05
This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes, and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-gram) and Dynamic algorithms are compared; both show similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
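A small sketch of a bigram (di-gram) Dice similarity of the kind referred to above, scoring lexical overlap between a UML class-attribute name and a CDE name; the set-based variant and the example strings are assumptions:

```python
# Sketch: Dice coefficient over character bigrams for lexical name matching.
def bigrams(text):
    s = text.lower().replace("_", " ")
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a, b):
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2.0 * len(ba & bb) / (len(ba) + len(bb))

print(dice("Patient.birthDate", "Patient Birth Date"))   # high similarity expected
print(dice("Specimen.volume", "Patient Birth Date"))     # low similarity expected
```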
Read, Mark; Andrews, Paul S; Timmis, Jon; Kumar, Vipin
2014-10-06
We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology.
NASA Astrophysics Data System (ADS)
Falconi, Stefanie M.; Palmer, Richard N.
2017-02-01
Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet, were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. The five dimensions characterize the "who, when, how, and why" of each participatory effort (stage 1). Models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy (stage 2). This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the two-stage proposed framework is the flexibility it has to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. Also, the evaluation criteria provide a structured vocabulary based on clear mechanisms that extend beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.
A Formal Theory for Modular ERDF Ontologies
NASA Astrophysics Data System (ADS)
Analyti, Anastasia; Antoniou, Grigoris; Damásio, Carlos Viegas
The success of the Semantic Web is impossible without some form of modularity, encapsulation, and access control. In an earlier paper, we extended RDF graphs with weak and strong negation, as well as derivation rules. The ERDF #n-stable model semantics of the extended RDF framework (ERDF) is defined, extending RDF(S) semantics. In this paper, we propose a framework for modular ERDF ontologies, called the modular ERDF framework, which enables collaborative reasoning over a set of ERDF ontologies, while support for hidden knowledge is also provided. In particular, the modular ERDF stable model semantics of modular ERDF ontologies is defined, extending the ERDF #n-stable model semantics. Our proposed framework supports local semantics and different points of view, local closed-world and open-world assumptions, and scoped negation-as-failure. Several complexity results are provided.
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.
Pang, Wei; Coghill, George M
2015-05-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
2014-09-18
Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. Dissertation, Jeffrey D. Morris. Presented to the Faculty, Department of Systems
ERIC Educational Resources Information Center
Wang, Chia-Yu; Barrow, Lloyd H.
2013-01-01
The purpose of the study was to explore students' conceptual frameworks of models of atomic structure and periodic variations, chemical bonding, and molecular shape and polarity, and how these conceptual frameworks influence their quality of explanations and ability to shift among chemical representations. This study employed a purposeful sampling…
Using framework-based synthesis for conducting reviews of qualitative studies.
Dixon-Woods, Mary
2011-04-14
Framework analysis is a technique used for data analysis in primary qualitative research. Recent years have seen its being adapted to conduct syntheses of qualitative studies. Framework-based synthesis shows considerable promise in addressing applied policy questions. An innovation in the approach, known as 'best fit' framework synthesis, has been published in BMC Medical Research Methodology this month. It involves reviewers in choosing a conceptual model likely to be suitable for the question of the review, and using it as the basis of their initial coding framework. This framework is then modified in response to the evidence reported in the studies in the reviews, so that the final product is a revised framework that may include both modified factors and new factors that were not anticipated in the original model. 'Best fit' framework-based synthesis may be especially suitable in addressing urgent policy questions where the need for a more fully developed synthesis is balanced by the need for a quick answer. Please see related article: http://www.biomedcentral.com/1471-2288/11/29.
Creating an outcomes framework.
Doerge, J B
2000-01-01
Four constructs used to build a framework for outcomes management for a large midwestern tertiary hospital are described in this article. A system framework outlining a model of clinical integration and population management based on Steven Shortell's work is discussed. This framework includes key definitions of high-risk patients, target groups, populations, and community. Roles for each level of population management and how they were implemented in the health care system are described. A point-of-service framework centered on seven dimensions of care is the next construct, applied on each nursing unit. The third construct outlines the framework for role development. Three roles for nursing were created to implement strategies for target groups that are strategic disease categories; two of those roles are described in depth. The philosophy of nursing practice is centered on caring and existential advocacy. The final construct is the modification of the Dartmouth model as a common framework for outcomes. System applications of the scorecard and lessons learned in the 2-year process of implementation are shared.