Publications by authors named "Chaowei Yang"

14 Publications


Concentric pattern of hair growth in alopecia areata.

J Dtsch Dermatol Ges 2021 Mar;19(3):451-453

Department of Dermatology, China-Japan Friendship Hospital, Beijing, China.

http://dx.doi.org/10.1111/ddg.14300_g
March 2021

Analysis of hydrochemical evolution in main discharge aquifers under mining disturbance and water source identification.

Environ Sci Pollut Res Int 2021 Jan 26. Epub 2021 Jan 26.

Peigou Coal Mine, Zhengzhou Coal Group Co., Ltd., Zhengzhou, 452382, Henan, People's Republic of China.

To characterize the hydrochemical evolution of the Peigou Coal Mine during mining, a hydrochemical evolution model covering mining disturbances since 2003 was established from the test results of 43 water samples collected at different times from three main discharge aquifers: Carboniferous Taiyuan Formation limestone water (L + L water), Ordovician limestone water (including Taiyuan Formation L), and Permian main mining coal seam roof and floor sandstone water (roof and floor water). In the Ordovician limestone water, carbonate and sulphate dissolution and pyrite oxidation decreased significantly and then increased in 2006, while silicate weathering was weak. In the roof and floor sandstone water, carbonate and sulphate dissolution, silicate weathering, and pyrite oxidation all increased. In addition, a water source identification model suited to the Peigou Coal Mine was developed by comparing a Fisher discriminant with a BP (back propagation) neural network discriminant. The accuracy rates of the Fisher discriminant and the BP neural network discriminant were 81.40% and 83.72%, respectively, indicating that the BP neural network is more accurate. Finally, the evolution of the hydraulic connection between aquifers was analysed. We speculate that a fracture development channel between the Ordovician limestone water aquifer and the roof and floor water aquifers was affected by mining disturbance in 2005. This study has significance for examining the hydrochemical evolution of groundwater in mines and can serve as a guideline for preventing and controlling water inrushes.
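The Fisher discriminant used in this comparison can be illustrated with a minimal two-class sketch. The ion-concentration features, class means, and sample counts below are hypothetical placeholders, not the paper's data or trained model.

```python
import numpy as np

def fisher_discriminant(X0, X1):
    """Fit a two-class Fisher linear discriminant.
    Returns the projection vector w and a midpoint decision threshold c."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, m1 - m0)   # direction maximizing class separation
    c = w @ (m0 + m1) / 2.0            # threshold at the projected midpoint
    return w, c

def classify(X, w, c):
    return (X @ w > c).astype(int)     # 1 -> class 1, 0 -> class 0

# Hypothetical ion concentrations (e.g. Ca, SO4, HCO3) for two aquifer types
rng = np.random.default_rng(0)
X0 = rng.normal([120.0, 300.0, 150.0], 15.0, size=(20, 3))  # limestone-water-like
X1 = rng.normal([60.0, 120.0, 350.0], 15.0, size=(20, 3))   # sandstone-water-like
w, c = fisher_discriminant(X0, X1)
pred = classify(np.vstack([X0, X1]), w, c)
truth = np.array([0] * 20 + [1] * 20)
print("accuracy:", (pred == truth).mean())
```

The BP neural network discriminant that the paper finds slightly more accurate replaces this linear projection with a trained nonlinear decision boundary.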
http://dx.doi.org/10.1007/s11356-021-12639-w
January 2021

The Impact of Policy Measures on Human Mobility, COVID-19 Cases, and Mortality in the US: A Spatiotemporal Perspective.

Int J Environ Res Public Health 2021 01 23;18(3). Epub 2021 Jan 23.

Department of Geography and GeoInformation Science, George Mason University, Fairfax, VA 22030, USA.

Social distancing policies have been regarded as effective in containing the rapid spread of COVID-19. However, policy effectiveness is not well understood from a spatiotemporal perspective. This study integrates geographical, demographic, and other key factors into a regression-based event study framework to assess the effectiveness of seven major policies on human mobility and COVID-19 case growth rates, with a spatiotemporal emphasis. Our results demonstrate that stay-at-home orders, workplace closures, and public information campaigns were effective in decreasing the confirmed case growth rate. For stay-at-home orders and workplace closures, these changes were associated with significant decreases (p < 0.05) in mobility. Public information campaigns did not show the same mobility trends, but the growth rate still decreased significantly in all analysis periods (p < 0.01). Stay-at-home orders and international/national travel controls had limited mitigating effects on the death case growth rate (p < 0.1). The relationships between policies, mobility, and epidemiological metrics allowed us to evaluate the effectiveness of each policy and gave us insight into the spatiotemporal patterns and mechanisms by which these measures work. Our analysis provides policymakers with better knowledge regarding the effectiveness of measures at space-time disaggregation.
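A regression-based event study of the kind described can be sketched as OLS on lead/lag dummies around a policy date. The window, outcome series, and coefficients below are illustrative assumptions, not the paper's specification or data.

```python
import numpy as np

def event_study(y, event_time, window=3):
    """OLS of y on lead/lag dummies in [-window, window]; t = -1 is the omitted baseline."""
    dummies, labels = [], []
    for k in range(-window, window + 1):
        if k == -1:
            continue  # reference period absorbed by the constant
        dummies.append((event_time == k).astype(float))
        labels.append(k)
    X = np.column_stack([np.ones_like(y)] + dummies)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return dict(zip(["const"] + labels, beta))

# Synthetic daily case growth rates: flat before the policy, lower from day 0 onward
event_time = np.arange(-3, 4)
y = np.where(event_time >= 0, 0.5, 1.0)
coefs = event_study(y, event_time)
print(coefs["const"], coefs[0])  # baseline level and the day-0 effect
```

Each post-policy coefficient measures the change in the outcome relative to the day before the policy, which is the spatiotemporal comparison the abstract describes.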
http://dx.doi.org/10.3390/ijerph18030996
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7908236
January 2021

Innovative use of concentrated growth factors combined with corticosteroids to treat discoid lupus erythematosus alopecia: A case report.

J Cosmet Dermatol 2020 Dec 23. Epub 2020 Dec 23.

Department of Dermatology, China-Japan Friendship Hospital, Beijing, China.

Alopecia in patients with discoid lupus erythematosus can sometimes be a refractory condition, in which mixed infiltrates of T lymphocytes and histiocytes lead to the destruction of hair follicles, which may cause permanent scarring. Early diagnosis and timely treatment can achieve hair regeneration and prevent further disease progression. Concentrated growth factor, a novel autologous plasma extract, contains various growth factors that can promote tissue regeneration. In this article, we report a case of concentrated growth factor combined with corticosteroids for the treatment of discoid lupus erythematosus alopecia, with a satisfactory clinical effect.
http://dx.doi.org/10.1111/jocd.13904
December 2020

Targetoid pattern of hair regrowth in alopecia areata.

J Dtsch Dermatol Ges 2021 Mar 30;19(3):451-453. Epub 2020 Oct 30.

Department of Dermatology, China-Japan Friendship Hospital, Beijing, China.

http://dx.doi.org/10.1111/ddg.14300
March 2021

Individual-Level Fatality Prediction of COVID-19 Patients Using AI Methods.

Front Public Health 2020 30;8:587937. Epub 2020 Sep 30.

Department of Geography and Geoinformation Science, George Mason University, Fairfax, VA, United States.

The global COVID-19 pandemic puts great pressure on medical resources worldwide and leads healthcare professionals to question which individuals are in imminent need of care. With appropriate data on each patient, hospitals can heuristically predict whether or not a patient requires immediate care. We adopted a deep learning model to predict the fatality of individuals who tested positive, given each patient's underlying health conditions, age, sex, and other factors. As the allocation of resources toward a vulnerable patient could mean the difference between life and death, a fatality prediction model serves as a valuable tool for healthcare workers prioritizing resources and hospital space. The models adopted were evaluated and refined using the metrics of accuracy, specificity, and sensitivity. After data preprocessing and training, our model is able to predict whether a confirmed COVID-19 patient is likely to die, given their information and disposition. The metrics of the different models are compared. Results indicate that the deep learning model outperforms other machine learning models on this rare-event prediction problem.
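The evaluation metrics named here (accuracy, sensitivity, specificity) reduce to simple confusion-matrix ratios. The patient labels below are invented for illustration, not the study's data.

```python
def confusion_counts(y_true, y_pred):
    """Tally true/false positives and negatives for a binary outcome (1 = died)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

def sensitivity(tp, fn):   # true positive rate: fatalities correctly flagged
    return tp / (tp + fn)

def specificity(tn, fp):   # true negative rate: survivors correctly cleared
    return tn / (tn + fp)

# Hypothetical outcomes for eight patients (1 = died, 0 = survived)
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 0, 1]
tp, tn, fp, fn = confusion_counts(y_true, y_pred)
print(accuracy(tp, tn, fp, fn), sensitivity(tp, fn), specificity(tn, fp))
```

For a rare-event problem like fatality prediction, sensitivity matters most: a model that predicts "survives" for everyone scores high accuracy but zero sensitivity.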
http://dx.doi.org/10.3389/fpubh.2020.587937
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7556112
September 2020

Spatiotemporal analysis of medical resource deficiencies in the U.S. under COVID-19 pandemic.

PLoS One 2020 14;15(10):e0240348. Epub 2020 Oct 14.

NSF Spatiotemporal Innovation Center, George Mason University, Fairfax, VA, United States of America.

Coronavirus disease 2019 (COVID-19) was first identified in December 2019 in Wuhan, China as an infectious disease and has quickly resulted in an ongoing pandemic. A data-driven approach was developed to estimate county-level medical resource deficiencies due to medical burdens during the COVID-19 pandemic. The study covered mainly February 15, 2020 to May 1, 2020 in the U.S. Multiple data sources were used to extract local population, hospital beds, critical care staff, COVID-19 confirmed case numbers, and hospitalization data at the county level. We estimated the average length of stay from hospitalization data at the state level and calculated the hospitalization rate at both the state and county levels. Then, we developed two medical resource deficiency indices that measure the local medical burden: the number of accumulated active confirmed cases normalized by local maximum potential medical resources, and the number of hospitalized patients that can be supported per ICU bed per critical care staff member, respectively. Data on medical resources and the two deficiency indices are illustrated in a dynamic spatiotemporal visualization platform based on ArcGIS Pro Dashboards. Our results provide new insights into U.S. pandemic preparedness and local dynamics relating to medical burdens in response to the COVID-19 pandemic.
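The normalization idea behind such deficiency indices can be sketched as a ratio of estimated demand to local capacity. The hospitalization rate, bed counts, and county values below are hypothetical placeholders, not the paper's calibrated indices.

```python
def deficiency_index(active_cases, hospitalization_rate, available_beds):
    """Estimated bed demand over local capacity; values above 1 signal a deficiency."""
    demand = active_cases * hospitalization_rate
    return demand / available_beds

# Hypothetical county snapshots on one day
counties = {
    "county_a": dict(active_cases=1200, hospitalization_rate=0.15, available_beds=150),
    "county_b": dict(active_cases=300, hospitalization_rate=0.15, available_beds=200),
}
indices = {name: deficiency_index(**c) for name, c in counties.items()}
print(indices)  # county_a exceeds capacity; county_b does not
```

Recomputing such an index per county per day yields the spatiotemporal surface that a dashboard like the one described can animate.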
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0240348
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7556467
October 2020

Spatiotemporal impacts of COVID-19 on air pollution in California, USA.

Sci Total Environ 2021 Jan 10;750:141592. Epub 2020 Aug 10.

NSF Spatiotemporal Innovation Center, George Mason Univ., Fairfax, VA 22030, USA; Department of Geography and GeoInformation Science, George Mason Univ., Fairfax, VA 22030, USA. Electronic address:

Various recent studies have shown that societal efforts to mitigate the outbreak of the 2019 coronavirus disease (COVID-19), such as "lockdowns", caused non-negligible impacts on the environment, especially air quality. To examine whether interventional policies due to COVID-19 had a similar impact in the US state of California, this paper investigates the spatiotemporal patterns and changes in air pollution before, during, and after the lockdown of the state, comparing air quality measurements in 2020 with historical averages from 2015 to 2019. Through time series analysis, a sudden drop and an uptick in air pollution are found around the dates when the shutdown and reopening were ordered, respectively. The spatial patterns of nitrogen dioxide (NO2) tropospheric vertical column density (TVCD) show a decreasing trend over the locations of major power plants and an increasing trend over residential areas near intersections of national highways. Ground-based observations around California show a 38%, 49%, and 31% drop in the concentrations of NO2, carbon monoxide (CO), and fine particulate matter (PM2.5) during the lockdown (March 19-May 7) compared to before (January 26-March 18) in 2020. These drops are 16%, 25%, and 19% sharper than the means of the previous five years over the same periods, respectively. Our study offers evidence of the environmental impact introduced by COVID-19 and insight into related economic influences.
http://dx.doi.org/10.1016/j.scitotenv.2020.141592
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7416771
January 2021

High operating temperature pBn barrier mid-wavelength infrared photodetectors and focal plane array based on InAs/InAsSb strained layer superlattices.

Opt Express 2020 Jun;28(12):17611-17619

Improving the operating temperature of focal plane array (FPA) imagers is critical to meeting the demand to reduce the size, weight, and power (SWaP) of mid-infrared detection systems. In this work, we report the demonstration of a 15 µm-pitch 640×512 middle-format pBn FPA device with a 50% cutoff wavelength of 4.8 µm based on short-period InAs/InAsSb "Ga-free" type-II strained-layer superlattices, which achieves a high operating temperature (HOT) reaching 185 K. The pBn FPA exhibits a mean noise equivalent differential temperature (NETD) of 39.5 mK and an operability of 99.6% using f/2.0 optics for a 300 K background at 150 K. The mean quantum efficiency is 57.6% without antireflection coating, and the dark current density is 5.39×10 A/cm² at an operating bias of -400 mV, from which the mean specific detectivity (D*) is calculated to be as high as 4.43×10 cm·Hz^1/2/W.
http://dx.doi.org/10.1364/OE.395770
June 2020

Improving the Non-Hydrostatic Numerical Dust Model by Integrating Soil Moisture and Greenness Vegetation Fraction Data with Different Spatiotemporal Resolutions.

PLoS One 2016 9;11(12):e0165616. Epub 2016 Dec 9.

NSF Spatiotemporal Innovation Center and Department of Geography and Geoinformation Science, George Mason University, Fairfax, VA, United States of America.

Dust storms are devastating natural disasters that cost billions of dollars and many human lives every year. Using the Non-Hydrostatic Mesoscale Dust Model (NMM-dust), this research studies how different spatiotemporal resolutions of two input parameters (soil moisture and greenness vegetation fraction) impact the sensitivity and accuracy of a dust model. Experiments are conducted by simulating dust concentration during July 1-7, 2014, for a target area covering parts of Arizona and California (31, 37, -118, -112) with a resolution of ~3 km. Using ground-based and satellite observations, this research validates the temporal evolution and spatial distribution of dust storm output from the NMM-dust and quantifies model error using four evaluation metrics (mean bias error, root mean square error, correlation coefficient, and fractional gross error). Results show that the default configuration of NMM-dust (with a low spatiotemporal resolution of both input parameters) generates an overestimation of Aerosol Optical Depth (AOD). Although it is able to qualitatively reproduce the temporal trend of the dust event, the default configuration of NMM-dust cannot fully capture its actual spatial distribution. Adjusting the spatiotemporal resolution of the soil moisture and vegetation cover datasets showed that the model is sensitive to both parameters. Increasing the spatiotemporal resolution of soil moisture effectively reduces the model's overestimation of AOD, while increasing the spatiotemporal resolution of vegetation cover changes the spatial distribution of the reproduced dust storm. Adjusting both parameters enables NMM-dust to capture the spatial distribution of dust storms and to reproduce dust concentrations more accurately.
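The four evaluation metrics named in the abstract have standard closed forms; a minimal sketch follows, with made-up model/observation pairs standing in for the AOD series.

```python
import math

def mean_bias_error(model, obs):
    """Average signed difference; positive values indicate overestimation."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root mean square error."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def pearson_r(model, obs):
    """Pearson correlation coefficient."""
    n = len(obs)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    sm = math.sqrt(sum((m - mm) ** 2 for m in model))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    return cov / (sm * so)

def fractional_gross_error(model, obs):
    """FGE = (2/N) * sum(|m - o| / (m + o)); bounded between 0 and 2."""
    return 2.0 / len(obs) * sum(abs(m - o) / (m + o) for m, o in zip(model, obs))

# Hypothetical AOD values: the model overestimates each observation by 1
model = [2.0, 4.0, 6.0]
obs = [1.0, 3.0, 5.0]
print(mean_bias_error(model, obs), rmse(model, obs))
```

A positive mean bias error with high correlation, as in this toy case, matches the abstract's finding of systematic AOD overestimation despite a correct temporal trend.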
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0165616
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5147792
July 2017

Developing Subdomain Allocation Algorithms Based on Spatial and Communicational Constraints to Accelerate Dust Storm Simulation.

PLoS One 2016 4;11(4):e0152250. Epub 2016 Apr 4.

Yunnan Provincial Geomatics Center, Kunming, Yunnan, China.

Dust storms have serious, disastrous impacts on the environment, human health, and assets. The development and application of dust storm models have contributed significantly to better understanding and predicting the distribution, intensity, and structure of dust storms. However, dust storm simulation is a data- and computing-intensive process. To improve computing performance, high performance computing has been widely adopted by dividing the entire study area into multiple subdomains and allocating each subdomain to different computing nodes in a parallel fashion. Inappropriate allocation may introduce imbalanced task loads and unnecessary communication among computing nodes. Therefore, allocation is a key factor that may impact the efficiency of the parallel process. An allocation algorithm is expected to consider the computing cost and communication cost of each computing node to minimize total execution time and reduce the overall communication cost of the entire simulation. This research introduces three algorithms that optimize the allocation by considering spatial and communicational constraints: 1) an Integer Linear Programming (ILP)-based algorithm from a combinatorial optimization perspective; 2) a combined K-Means and Kernighan-Lin heuristic algorithm (K&K) that integrates geometric and coordinate-free methods by merging local and global partitioning; 3) an automatic seeded region growing based geometric and local partitioning algorithm (ASRG). The performance and effectiveness of the three algorithms are compared across different factors. Further, we adopt the K&K algorithm for the dust model simulation experiment with the non-hydrostatic mesoscale model (NMM-dust) and compare its performance with the default sequential MPI allocation. The results demonstrate that the K&K method significantly improves simulation performance through better subdomain allocation. This method can also be adopted for other relevant atmospheric and numerical modeling.
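The load-balancing goal these allocation algorithms pursue can be illustrated with a much simpler greedy heuristic (heaviest subdomain to the least-loaded node). This sketch ignores the communication costs that ILP, K&K, and ASRG account for, and the subdomain loads are made up.

```python
import heapq

def greedy_allocate(subdomain_loads, n_nodes):
    """Assign each subdomain, heaviest first, to the currently least-loaded node."""
    heap = [(0.0, node) for node in range(n_nodes)]  # (running load, node id)
    heapq.heapify(heap)
    assignment, totals = {}, [0.0] * n_nodes
    for name, load in sorted(subdomain_loads.items(), key=lambda kv: -kv[1]):
        total, node = heapq.heappop(heap)
        assignment[name] = node
        totals[node] = total + load
        heapq.heappush(heap, (total + load, node))
    return assignment, totals

# Hypothetical per-subdomain computing costs for two nodes
loads = {"A": 5.0, "B": 4.0, "C": 3.0, "D": 3.0, "E": 2.0}
assignment, totals = greedy_allocate(loads, 2)
print(assignment, totals)
```

The paper's algorithms improve on this baseline by additionally keeping spatially adjacent subdomains on the same node, which cuts the halo-exchange communication a pure load balancer ignores.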
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0152250
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4820278
August 2016

Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

PLoS One 2015 5;10(3):e0116781. Epub 2015 Mar 5.

NSF Spatiotemporal Innovation Center, George Mason University, Fairfax, VA, United States of America; Department of Computer Science, University of Texas-Austin, Austin, Texas, United States of America.

Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive volume of data is both computing and data intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics, leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand, complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying analytical procedures for geoscientists.
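The MapReduce pattern the framework builds on can be sketched in miniature as map, shuffle, and reduce phases over (grid cell, value) records. The per-cell mean computed here is an illustrative stand-in for the framework's HBase-backed, distributed geoscience processing.

```python
from collections import defaultdict

def map_phase(records):
    """Emit (key, value) pairs; here records are (grid_cell_id, measurement)."""
    for cell, value in records:
        yield cell, value

def shuffle(pairs):
    """Group values by key, as the MapReduce runtime does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Per-cell mean, as one reducer per key would compute it."""
    return {cell: sum(vals) / len(vals) for cell, vals in groups.items()}

records = [("c1", 10.0), ("c1", 14.0), ("c2", 20.0)]
result = reduce_phase(shuffle(map_phase(records)))
print(result)
```

In a real deployment each phase runs in parallel across distributed nodes, which is where the reported reduction in processing time comes from.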
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0116781
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4351198
February 2016

A service brokering and recommendation mechanism for better selecting cloud services.

PLoS One 2014 29;9(8):e105297. Epub 2014 Aug 29.

NSF Spatiotemporal Innovation Center, George Mason University, Fairfax, Virginia, United States of America.

Cloud computing is becoming the new generation of computing infrastructure, and many cloud vendors provide different types of cloud services. How to choose the best cloud services for specific applications is very challenging. Addressing this challenge requires balancing multiple factors, such as business demands, technologies, policies, and preferences, in addition to the computing requirements. This paper recommends a mechanism for selecting the best public cloud service at the levels of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). A systematic framework and associated workflow include cloud service filtration, solution generation, evaluation, and selection of public cloud services. Specifically, we propose the following: a hierarchical information model for integrating heterogeneous cloud information from different providers and a corresponding cloud information collecting mechanism; a cloud service classification model for categorizing and filtering cloud services and an application requirement schema for providing rules for creating application-specific configuration solutions; and a preference-aware solution evaluation model for evaluating and recommending solutions according to the preferences of application providers. To test the proposed framework and methodologies, a cloud service advisory tool prototype was developed, after which relevant experiments were conducted. The results show that the proposed system collects, updates, and records cloud information from multiple mainstream public cloud services in real time; generates feasible cloud configuration solutions according to user specifications and acceptable cost prediction; assesses solutions from multiple aspects (e.g., computing capability, potential cost, and Service Level Agreement, SLA); offers rational recommendations based on user preferences and practical cloud provisioning; and visually presents and compares solutions through an interactive web Graphical User Interface (GUI).
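The preference-aware evaluation step can be sketched as a weighted sum over normalized criteria. The criteria names, weights, and offerings below are hypothetical, not drawn from the paper's cloud information model.

```python
def score(solution, weights):
    """Weighted sum of criteria already normalized to [0, 1]; higher is better.
    Cost is assumed pre-inverted (1 = cheapest) so all criteria point the same way."""
    return sum(weights[k] * solution[k] for k in weights)

weights = {"compute": 0.5, "cost": 0.3, "sla": 0.2}  # user preferences, sum to 1
offerings = {
    "provider_x": {"compute": 0.9, "cost": 0.4, "sla": 0.8},
    "provider_y": {"compute": 0.6, "cost": 0.9, "sla": 0.7},
}
ranked = sorted(offerings, key=lambda name: score(offerings[name], weights),
                reverse=True)
print(ranked[0])
```

Shifting the weights toward cost would flip the ranking, which is the sense in which a recommendation is "preference-aware": the same offerings produce different best choices for different users.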
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0105297
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4149509
May 2015

Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

Proc Natl Acad Sci U S A 2011 Apr 28;108(14):5498-503. Epub 2011 Mar 28.

Joint Center for Intelligent Spatial Computing, College of Science, George Mason University, Fairfax, VA 22030-4444, USA.

Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.
http://dx.doi.org/10.1073/pnas.0909315108
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3078382
April 2011