Yajun Mei

Professor of Biostatistics

Professional overview

Yajun Mei is a Professor of Biostatistics at the NYU School of Global Public Health (GPH), a position he has held since July 1, 2024. He received the B.S. degree in Mathematics from Peking University, Beijing, China, in 1996, and the Ph.D. degree in Mathematics with a minor in Electrical Engineering from the California Institute of Technology, Pasadena, CA, USA, in 2003. From 2003 to 2005 he was a postdoctoral fellow in biostatistics at the Fred Hutchinson Cancer Center in Seattle, WA. Prior to joining NYU, Dr. Mei was an Assistant, Associate, and then Full Professor in the H. Milton Stewart School of Industrial and Systems Engineering at the Georgia Institute of Technology, Atlanta, GA, for 18 years, from 2006 to 2024, and had been a co-director of the Biostatistics, Epidemiology, and Research Design (BERD) program of Georgia CTSA since 2018.

Dr. Mei’s research interests are statistics, machine learning, and data science, and their applications in biomedical science and public health, particularly streaming data analysis, sequential decision/design, change-point problems, precision/personalized medicine, hot-spot detection for infectious diseases, longitudinal data analysis, bioinformatics, and clinical trials. His work has received several honors, including the Abraham Wald Prize in Sequential Analysis in both 2009 and 2024, an NSF CAREER Award in 2010, election as a Fellow of the American Statistical Association (ASA) in 2023, and multiple best paper awards.

Education

BS, Mathematics, Peking University
PhD, Mathematics, California Institute of Technology

Honors and awards

Fellow of the American Statistical Association (2023)
Star Research Achievement Award, 2021 Virtual Critical Care Congress (2021)
Best Paper Competition Award, Quality, Statistics & Reliability of INFORMS (2020)
Bronze Snapshot Award, Society of Critical Care Medicine (2019)
NSF CAREER Award (2010)
Thank a Teacher Certificate, Center for Teaching and Learning (2011, 2012, 2016, 2020, 2021, 2022, 2023)
Abraham Wald Prize (2009)
Best Paper Award, 11th International Conference on Information Fusion (2008)
New Researcher Fellow, Statistical and Applied Mathematical Sciences Institute (2005)
Fred Hutchinson SPAC Travel Award to attend 2005 Joint Statistical Meetings, Minneapolis, MN (2005)
Travel Award to 8th New Researchers Conference, Minneapolis, MN (2005)
Travel Award to IEEE International Symposium on Information Theory, Chicago, IL (2004)
Travel Award to IPAM workshop on inverse problems, UCLA, Los Angeles, CA (2003)
Fred Hutchinson SPAC Course Scholarship (2003)
Travel Award to the SAMSI workshop on inverse problems, Research Triangle Park, NC (2002)

Publications

CSSQ: a ChIP-seq signal quantifier pipeline

Kumar, A., Hu, M. Y., Mei, Y., & Fan, Y. (n.d.).

Publication year

2023

Journal title

Frontiers in Cell and Developmental Biology

Volume

11
Abstract
Chromatin immunoprecipitation followed by sequencing (ChIP-seq) has revolutionized the study of epigenomes, and the massive increase in ChIP-seq datasets calls for robust and user-friendly computational tools for quantitative ChIP-seq. Quantitative ChIP-seq comparisons have been challenging due to the noisiness and variations inherent to ChIP-seq and epigenomes. By employing innovative statistical approaches specially catered to the ChIP-seq data distribution, along with sophisticated simulations and extensive benchmarking studies, we developed and validated CSSQ as a nimble statistical analysis pipeline capable of differential binding analysis across ChIP-seq datasets with high confidence and sensitivity and a low false discovery rate for any defined regions. CSSQ models ChIP-seq data as a finite mixture of Gaussians that faithfully reflects the ChIP-seq data distribution. Through a combination of Anscombe transformation, k-means clustering, and estimated maximum normalization, CSSQ minimizes noise and bias from experimental variations. Further, CSSQ utilizes a non-parametric approach and incorporates comparisons under the null hypothesis by unaudited column permutation to perform robust statistical tests that account for the small numbers of replicates of ChIP-seq datasets. In sum, we present CSSQ as a powerful statistical computational pipeline tailored for ChIP-seq data quantitation and a timely addition to the toolkit of differential binding analysis for deciphering epigenomes.

Decentralized multihypothesis sequential detection

Wang, Y., & Mei, Y. (n.d.).

Publication year

2010

Page(s)

1393-1397
Abstract
This article is concerned with decentralized sequential testing of multiple hypotheses. In a sensor network system with limited local memory, raw observations are observed at the local sensors and quantized into binary sensor messages that are sent to a fusion center, which makes a final decision. It is assumed that the raw sensor observations are distributed according to one of M ≥ 2 specified distributions, and the fusion center has to utilize the quantized sensor messages to decide which is the true distribution. Asymptotically Bayes tests are offered for decentralized multihypothesis sequential detection by combining three existing methodologies: tandem quantizers, unambiguous likelihood quantizers, and randomized quantizers.

Decentralized two-sided sequential tests for a normal mean

Wang, V., & Mei, Y. (n.d.).

Publication year

2009

Page(s)

2408-2412
Abstract
This article is concerned with decentralized sequential testing of a normal mean μ with two-sided alternatives. It is assumed that, in a single-sensor network system with limited local memory, i.i.d. normal raw observations are observed at the local sensor and quantized into binary messages that are sent to the fusion center, which makes a final decision between the null hypothesis H0: μ = 0 and the alternative hypothesis H1: μ = ±1. We propose a decentralized sequential test using the idea of tandem quantizers (or, equivalently, a one-shot feedback). Surprisingly, our proposed test only uses quantizers of the form I(Xn ≥ λ), but it is shown to be asymptotically Bayes. Moreover, by adopting the principle of invariance, we also investigate decentralized invariant tests with stationary quantizers of the form I(|Xn| ≤ λ), and show that λ = 0.5 only leads to a suboptimal decentralized invariant sequential test. Numerical simulations are conducted to support our arguments.

Differentially private change-point detection

Cummings, R., Krehbiel, S., Mei, Y., Tuo, R., & Zhang, W. (n.d.).

Publication year

2018

Journal title

Advances in Neural Information Processing Systems

Volume

2018-December

Page(s)

10825-10834
Abstract
The change-point detection problem seeks to identify distributional changes at an unknown change-point k in a stream of data. This problem appears in many important practical settings involving personal data, including biosurveillance, fault detection, finance, signal detection, and security systems. The field of differential privacy offers data analysis tools that provide powerful worst-case privacy guarantees. We study the statistical problem of change-point detection through the lens of differential privacy. We give private algorithms for both online and offline change-point detection, analyze these algorithms theoretically, and provide empirical validation of our results.
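
The offline setting described above invites a simple illustration. Below is a generic report-noisy-argmax sketch of private offline change-point estimation, not the paper's exact algorithm: score each candidate change time by its log-likelihood evidence, add Laplace noise calibrated to a caller-supplied sensitivity bound, and release the noisy maximizer. All function and parameter names are illustrative.

```python
import random

def noisy_argmax_changepoint(x, mu0, mu1, epsilon, sensitivity):
    """Offline change-point estimate via report-noisy-argmax.

    Scores candidate k by the log-likelihood evidence that the
    post-change segment x[k:] has mean mu1 (unit-variance Gaussian
    model), perturbs each score with Laplace(sensitivity/epsilon)
    noise, and returns the noisy argmax.
    """
    b = sensitivity / epsilon  # Laplace scale
    # Log-likelihood-ratio increment of N(mu1, 1) vs N(mu0, 1) per point.
    llr = [(mu1 - mu0) * (xi - (mu0 + mu1) / 2.0) for xi in x]
    # suffix_sums[k] = sum of llr over indices >= k.
    suffix_sums = [0.0] * (len(x) + 1)
    for i in range(len(x) - 1, -1, -1):
        suffix_sums[i] = suffix_sums[i + 1] + llr[i]
    best_k, best_val = None, float("-inf")
    for k in range(1, len(x)):
        # Laplace(0, b) as the difference of two exponentials.
        noise = random.expovariate(1.0 / b) - random.expovariate(1.0 / b)
        val = suffix_sums[k] + noise
        if val > best_val:
            best_k, best_val = k, val
    return best_k
```

With a strong signal and a generous privacy budget the estimate concentrates near the true change time; a smaller epsilon trades accuracy for privacy.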

Directional false discovery rate control in large-scale multiple comparisons

Liang, W., Xiang, D., Mei, Y., & Li, W. (n.d.).

Publication year

2024

Journal title

Journal of Applied Statistics

Volume

51

Issue

15

Page(s)

3195-3214
Abstract
The advance of high-throughput biomedical technology makes it possible to access massive measurements of gene expression levels. An important statistical issue is identifying both under-expressed and over-expressed genes for a disease. Most existing multiple-testing procedures focus on selecting only the non-null or significant genes without further identifying their expression type. Only limited methods are designed for the directional problem, and they fail to separately control the numbers of falsely discovered over-expressed and under-expressed genes, offering only a unified index that combines all the false discoveries. In this paper, based on a three-classification multiple testing framework, we propose a practical data-driven procedure to control separately the two directions of false discoveries. The proposed procedure is theoretically valid and optimal in the sense that it maximizes the expected number of true discoveries while controlling the false discovery rates for under-expressed and over-expressed genes simultaneously. The procedure allows different nominal levels for the two directions, exhibiting high flexibility in practice. Extensive numerical results and analysis of two large-scale genomic datasets show the effectiveness of our procedure.

Discussion on "Change-Points : From Sequential Detection to Biology and Back" by David O. Siegmund

Mei, Y. (n.d.).

Publication year

2013

Journal title

Sequential Analysis

Volume

32

Issue

1

Page(s)

32-35
Abstract
In his interesting paper, Professor Siegmund illustrates that the problem formulations and methodologies are generally transferable between off-line and on-line settings of change-point problems. In our discussion of his paper, we echo his thoughts with our own experiences.

Discussion on "Quickest detection problems : Fifty years later" by Albert N. Shiryaev

Mei, Y. (n.d.).

Publication year

2010

Journal title

Sequential Analysis

Volume

29

Issue

4

Page(s)

410-414
Abstract
In his interesting article, Professor Shiryaev reviewed how he was motivated by real applications of target detection in radar systems to develop sequential change-point detection theory, and he also described different approaches to formulating the mathematical problems. In our discussion of his article, we focus on a couple of new real-world applications of sequential change-point detection, as well as some new challenges.

Discussion on “Sequential detection/isolation of abrupt changes” by Igor V. Nikiforov

Liu, K., & Mei, Y. (n.d.).

Publication year

2016

Journal title

Sequential Analysis

Volume

35

Issue

3

Page(s)

316-319
Abstract
In this interesting article, Professor Nikiforov reviewed the current state of the quickest change detection/isolation problem. In our discussion of his article, we focus on the concerns and the opportunities of the subfield of quickest change detection or, more generally, sequential methodologies, in the modern information age.

Does intrathecal nicardipine for cerebral vasospasm following subarachnoid hemorrhage correlate with reduced delayed cerebral ischemia? A retrospective propensity score-based analysis

Sadan, O., Waddel, H., Moore, R., Feng, C., Mei, Y., Pearce, D. G., Kraft, J., Pimentel, C., Mathew, S., Akbik, F., Ameli, P., Taylor, A., Danyluk, L., Martin, K. S., Garner, K., Kolenda, J., Pujari, A., Asbury, W., … Samuels, O. (n.d.).

Publication year

2022

Journal title

Journal of Neurosurgery

Volume

136

Issue

1

Page(s)

115-124
Abstract
OBJECTIVE Cerebral vasospasm and delayed cerebral ischemia (DCI) contribute to poor outcome following subarachnoid hemorrhage (SAH). With the paucity of effective treatments, the authors describe their experience with intrathecal (IT) nicardipine for this indication. METHODS Patients admitted to the Emory University Hospital neuroscience ICU between 2012 and 2017 with nontraumatic SAH, either aneurysmal or idiopathic, were included in the analysis. Using a propensity-score model, this patient cohort was compared to patients in the Subarachnoid Hemorrhage International Trialists (SAHIT) repository who did not receive IT nicardipine. The primary outcome was DCI. Secondary outcomes were long-term functional outcome and adverse events. RESULTS The analysis included 1351 patients, 422 of whom were diagnosed with cerebral vasospasm and treated with IT nicardipine. When compared with patients with no vasospasm (n = 859), the treated group was significantly younger (mean age 51.1 ± 12.4 years vs 56.7 ± 14.1 years, p < 0.001), had a higher World Federation of Neurosurgical Societies score and modified Fisher grade, and were more likely to undergo clipping of the ruptured aneurysm as compared to endovascular treatment (30.3% vs 11.3%, p < 0.001). Treatment with IT nicardipine decreased the daily mean transcranial Doppler velocities in 77.3% of the treated patients. When compared to patients not receiving IT nicardipine, treatment was not associated with an increased rate of bacterial ventriculitis (3.1% vs 2.7%, p > 0.1), yet higher rates of ventriculoperitoneal shunting were noted (19.9% vs 8.8%, p < 0.01). In a propensity score comparison to the SAHIT database, the odds ratio (OR) to develop DCI with IT nicardipine treatment was 0.61 (95% confidence interval [CI] 0.44-0.84), and the OR to have a favorable functional outcome (modified Rankin Scale score ≤ 2) was 2.17 (95% CI 1.61-2.91). 
CONCLUSIONS IT nicardipine was associated with improved outcome and reduced DCI compared with propensity-matched controls. There was an increased need for permanent CSF diversion but no other safety issues. These data should be considered when selecting medications and treatments to study in future randomized controlled clinical trials for SAH.

Early detection of a change in Poisson rate after accounting for population size effects

Mei, Y., Han, S. W., & Tsui, K. L. (n.d.).

Publication year

2011

Journal title

Statistica Sinica

Volume

21

Issue

2

Page(s)

597-624
Abstract
Motivated by applications in bio and syndromic surveillance, this article is concerned with the problem of detecting a change in the mean of Poisson distributions after taking into account the effects of population size. The family of generalized likelihood ratio (GLR) schemes is proposed and its asymptotic optimality properties are established under the classical asymptotic setting. However, numerical simulation studies illustrate that the GLR schemes are at times not as efficient as two families of ad-hoc schemes based on either the weighted likelihood ratios or the adaptive threshold method that adjust the effects of population sizes. To explain this, a further asymptotic optimality analysis is developed under a new asymptotic setting that is more suitable to our finite-sample numerical simulations. In addition, we extend our approaches to a general setting with arbitrary probability distributions, as well as to the continuous-time setting involving the multiplicative intensity models for Poisson processes, but further research is needed.
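
The population-adjusted monitoring problem above can be sketched with a plain likelihood-ratio CUSUM (a textbook baseline, not the paper's GLR or weighted-likelihood schemes), assuming counts[t] ~ Poisson(populations[t] * r) with in-control rate r0 and out-of-control rate r1:

```python
import math

def poisson_cusum(counts, populations, r0, r1, threshold):
    """Raise an alarm when the Poisson likelihood-ratio CUSUM,
    adjusted for the time-varying population size, crosses `threshold`.
    Returns the alarm time, or None if no alarm is raised."""
    w = 0.0
    for t, (n, pop) in enumerate(zip(counts, populations)):
        # Log-likelihood ratio of Poisson(pop * r1) vs Poisson(pop * r0).
        llr = n * math.log(r1 / r0) - pop * (r1 - r0)
        w = max(0.0, w + llr)
        if w > threshold:
            return t
    return None
```

For example, with populations of 100 and a rate jump from 0.1 to 0.3 at time 20, the alarm fires on the first clearly elevated post-change count.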

Editorial: Mathematical Fundamentals of Machine Learning

Glickenstein, D., Hamm, K., Huo, X., Mei, Y., & Stoll, M. (n.d.).

Publication year

2021

Journal title

Frontiers in Applied Mathematics and Statistics

Volume

7

Editorial to the special issue: modern streaming data analytics

Mei, Y., Bartroff, J., Chen, J., Fellouris, G., & Zhang, R. (n.d.).

Publication year

2023

Journal title

Journal of Applied Statistics

Volume

50

Issue

14

Page(s)

2857-2861

Effect of bivariate data's correlation on sequential tests of circular error probability

Li, Y., & Mei, Y. (n.d.).

Publication year

2016

Journal title

Journal of Statistical Planning and Inference

Volume

171

Page(s)

99-114
Abstract
The problem of evaluating a military or GPS/GSM system's precision quality is considered in this article, where one sequentially observes bivariate normal data (Xi, Yi) and wants to test hypotheses on the circular error probability (CEP), or the probability of nonconforming, i.e., the probabilities of the system hitting or missing a pre-specified disk target. In such a problem, we first consider a sequential probability ratio test (SPRT) developed under the erroneous assumption of correlation coefficient ρ = 0, and investigate its properties when the true ρ ≠ 0. It is shown that at least one of the Type I and Type II error probabilities will be larger than required if the true ρ ≠ 0, and for the detailed effects, exp(-2) ≈ 0.1353 turns out to be a critical value for the hypothesized probability of nonconforming. Moreover, we propose several sequential tests for the case where the correlation coefficient ρ is unknown, and among these tests, the generalized sequential likelihood ratio test (GSLRT) in Bangdiwala (1982) seems to work well.
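
The hit/miss testing problem lends itself to Wald's classical SPRT on the binary nonconforming indicators; the sketch below (thresholds via Wald's approximations) is a baseline illustration, not the GSLRT studied in the paper:

```python
import math

def sprt_bernoulli(obs, p0, p1, alpha, beta):
    """Wald's SPRT for H0: p = p0 vs H1: p = p1 on a stream of 0/1
    miss indicators. Returns (decision, stopping index); `decision`
    is "continue" if neither boundary is crossed."""
    low = math.log(beta / (1 - alpha))   # accept-H0 boundary
    high = math.log((1 - beta) / alpha)  # accept-H1 boundary
    llr = 0.0
    for t, x in enumerate(obs):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= high:
            return "accept H1", t
        if llr <= low:
            return "accept H0", t
    return "continue", len(obs) - 1
```

Because misses carry a larger log-likelihood increment than hits when p1 > p0, a run of misses stops the test much sooner than a run of hits.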

Efficient scalable schemes for monitoring a large number of data streams

Mei, Y. (n.d.).

Publication year

2010

Journal title

Biometrika

Volume

97

Issue

2

Page(s)

419-433
Abstract
The sequential changepoint detection problem is studied in the context of global online monitoring of a large number of independent data streams. We are interested in detecting an occurring event as soon as possible, but we do not know when the event will occur, nor do we know which subset of data streams will be affected by the event. A family of scalable schemes is proposed based on the sum of the local cumulative sum (CUSUM) statistics from each individual data stream, and is shown to asymptotically minimize the detection delays for each and every possible combination of affected data streams, subject to the global false alarm constraint. The usefulness and limitations of our asymptotic optimality results are illustrated by numerical simulations and heuristic arguments. The Appendices contain a probabilistic result on the first epoch to simultaneous record values for multiple independent random walks.
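
The sum-of-local-CUSUMs scheme described above is simple to state in code; a minimal sketch, where `llr_fn` is the (assumed known) pre/post-change log-likelihood ratio of a single observation:

```python
def sum_local_cusums(streams, llr_fn, threshold):
    """Monitor K data streams; stream k keeps its own CUSUM statistic
    W_k, and a global alarm is raised the first time sum_k W_k exceeds
    `threshold`. Returns the alarm time, or None."""
    w = [0.0] * len(streams)
    for t in range(len(streams[0])):
        for k, stream in enumerate(streams):
            w[k] = max(0.0, w[k] + llr_fn(stream[t]))
        if sum(w) > threshold:
            return t
    return None
```

For a unit-variance Gaussian mean shift from 0 to 1, llr_fn(x) = x - 0.5; only the affected streams accumulate evidence, which is why the scheme adapts to any subset of affected streams.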

Efficient Sequential UCB-based Hungarian Algorithm for Assignment Problems

Shi, Y., & Mei, Y. (n.d.).

Publication year

2022
Abstract
The assignment problem has many real-world applications, such as the allocation of agents to tasks for optimal utility gain. While it has been well studied in the optimization literature when the underlying utility between every pair of agent and task is known, research is limited when the utilities are unknown and need to be learned from data on the fly. In this work, motivated by the mentor-mentee matching application in U.S. universities, we develop an efficient sequential assignment algorithm, with the objective of nearly maximizing the overall utility simultaneously at each time. Our proposed algorithm uses stochastic binary bandit feedback to estimate the unknown utilities through logistic regression, and then combines the Upper Confidence Bound (UCB) method from the multi-armed bandit problem with the Hungarian algorithm from the assignment problem. We derive theoretical bounds for our algorithm on both the estimation error and the total regret, and numerical studies are conducted to illustrate its usefulness.
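
A simplified sketch of the idea: plain Bernoulli sample means stand in for the paper's logistic-regression estimates, and a brute-force search over permutations stands in for the Hungarian algorithm (fine for small n). All names are illustrative.

```python
import itertools
import math
import random

def best_assignment(scores):
    """Utility-maximizing assignment by brute force over permutations
    (a stand-in for the Hungarian algorithm; O(n!) but exact)."""
    n = len(scores)
    return max(itertools.permutations(range(n)),
               key=lambda perm: sum(scores[i][perm[i]] for i in range(n)))

def ucb_assignment(true_p, horizon, seed=0):
    """Sequential assignment with UCB indices and binary feedback.

    Each round: build UCB scores for every agent-task pair, assign by
    maximizing the total UCB score, observe Bernoulli(true_p[i][j])
    rewards for the chosen pairs, and update the estimates. Returns
    the final greedy assignment based on the empirical means."""
    rng = random.Random(seed)
    n = len(true_p)
    pulls = [[1e-9] * n for _ in range(n)]  # tiny prior avoids 0-division
    wins = [[0.0] * n for _ in range(n)]
    for t in range(1, horizon + 1):
        ucb = [[wins[i][j] / pulls[i][j]
                + math.sqrt(2.0 * math.log(t + 1) / pulls[i][j])
                for j in range(n)] for i in range(n)]
        for i, j in enumerate(best_assignment(ucb)):
            reward = 1.0 if rng.random() < true_p[i][j] else 0.0
            pulls[i][j] += 1
            wins[i][j] += reward
    means = [[wins[i][j] / pulls[i][j] for j in range(n)] for i in range(n)]
    return best_assignment(means)
```

On a small instance where matched pairs are far better than mismatched ones, a few hundred rounds suffice to recover the optimal matching.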

Glucose Variability as Measured by Inter-measurement Percentage Change is Predictive of In-patient Mortality in Aneurysmal Subarachnoid Hemorrhage

Sadan, O., Feng, C., Vidakovic, B., Mei, Y., Martin, K., Samuels, O., & Hall, C. L. (n.d.).

Publication year

2020

Journal title

Neurocritical Care

Volume

33

Issue

2

Page(s)

458-467
Abstract
Background: Critically ill aneurysmal subarachnoid hemorrhage (aSAH) patients suffer from systemic complications at a high rate. Hyperglycemia is a common intensive care unit (ICU) complication and has become a focus after aggressive glucose management was associated with improved ICU outcomes. Subsequent research has suggested that glucose variability, not a specific blood glucose range, may be a more appropriate clinical target. Glucose variability is highly correlated to poor outcomes in a wide spectrum of critically ill patients. Here, we investigate the changes between subsequent glucose values, termed “inter-measurement difference,” as an indicator of glucose variability and its association with outcomes in patients with aSAH. Methods: All SAH admissions to a single, tertiary referral center between 2002 and 2016 were screened. All aneurysmal cases who had more than 2 glucose measurements were included (n = 2451). We calculated several measures of variability, including simple variance, the average consecutive absolute change, average absolute change by time difference, within-subject variance, median absolute deviation, and average or median consecutive absolute percentage change. Predictor variables also included admission Hunt and Hess grade, age, gender, cardiovascular risk factors, and surgical treatment. In-patient mortality was the main outcome measure. Results: In a multiple regression analysis, nearly all forms of glucose variability calculations were found to be correlated with in-patient mortality. The consecutive absolute percentage change, however, was most predictive: OR 5.2 [1.4–19.8, CI 95%] for percentage change and 8.8 [1.8–43.6] for median change, when controlling for the defined predictors. Survival to ICU discharge was associated with lower glucose variability (consecutive absolute percentage change 17% ± 9%) compared with the group that did not survive to discharge (20% ± 15%, p < 0.01).
Interestingly, this finding was not significant in patients with pre-admission poorly controlled diabetes as indicated by HbA1c (OR 0.45 [0.04–7.18], by percentage change). The effect is driven mostly by non-diabetic patients or those with well-controlled diabetes. Conclusions: Reduced glucose variability is highly correlated with in-patient survival and long-term mortality in aSAH patients. This finding was observed in the non-diabetic and well-controlled diabetic patients, suggesting a possible benefit for personalized glucose targets based on baseline HbA1c and minimizing variability. The inter-measure percentage change as an indicator of glucose variability is not only predictive of outcome, but is an easy-to-use tool that could be implemented in future clinical trials.
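
The variability metric highlighted above is easy to compute; a sketch, assuming "consecutive absolute percentage change" means the absolute difference between successive readings divided by the preceding reading:

```python
def mean_abs_pct_change(glucose):
    """Average absolute percentage change between consecutive glucose
    readings, returned as a fraction (0.10 == 10%)."""
    changes = [abs(b - a) / a for a, b in zip(glucose, glucose[1:])]
    return sum(changes) / len(changes)
```

For instance, the series 100, 110, 99 has two consecutive changes of 10% each, for an average of 10%.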

Hot-spots detection in count data by Poisson assisted smooth sparse tensor decomposition

Zhao, Y., Huo, X., & Mei, Y. (n.d.).

Publication year

2023

Journal title

Journal of Applied Statistics

Volume

50

Issue

14

Page(s)

2999-3029
Abstract
Count data occur widely in many bio-surveillance and healthcare applications, e.g. the numbers of new patients of different types of infectious diseases from different cities/counties/states repeatedly over time, say, daily/weekly/monthly. For this type of count data, one important task is the quick detection and localization of hot-spots in terms of unusual infectious rates so that we can respond appropriately. In this paper, we develop a method called Poisson assisted Smooth Sparse Tensor Decomposition (PoSSTenD), which not only detects when hot-spots occur but also localizes where hot-spots occur. The main idea of our proposed PoSSTenD method is articulated as follows. First, we represent the observed count data as a three-dimensional tensor including (1) a spatial dimension for location patterns, e.g. different cities/counties/states; (2) a temporal dimension for time patterns, e.g. daily/weekly/monthly; (3) a categorical dimension for different types of data sources, e.g. different types of diseases. Second, we fit this tensor into a Poisson regression model, and then we further decompose the infectious rate into two components: a smooth global trend and local hot-spots. Third, we detect when hot-spots occur by building a cumulative sum (CUSUM) control chart and localize where hot-spots occur by their LASSO-type sparse estimation. The usefulness of our proposed methodology is validated through numerical simulation studies and a real-world dataset, which records the annual numbers of 10 different infectious diseases from 1993 to 2018 for 49 mainland states in the United States.

Impact of compensation coefficients on active sequential change point detection

Xu, Q., Mei, Y., & Shi, J. (n.d.).

Publication year

2025

Journal title

Sequential Analysis
Abstract
Under a general setting of active sequential change point detection problems, there are p local streams in a system, but we are only able to take observations from q out of these p local streams at each time instant owing to the sampling control constraint. At some unknown change time, an undesired event occurs to the system and changes the local distributions from f to g for a subset of s unknown local streams. The objective is to determine how to adaptively sample local streams and decide when to raise a global alarm, so that we can detect the change as quickly as possible subject to the false alarm constraint. One efficient algorithm is the TRAS algorithm proposed in Liu et al. (2015), which incorporates an idea of compensation coefficients for unobserved data streams. However, it is unclear how to choose the compensation coefficients suitably from a theoretical point of view to balance the trade-off between the detection delay and false alarm. In this article, we investigate the impact of compensation coefficients on the TRAS algorithm. Our main contributions are twofold. On the one hand, under the general setting, we prove that if the compensation coefficient is larger than (Formula presented.) where I(f, g) is the Kullback-Leibler divergence, then the TRAS algorithm is suboptimal in the sense of having too large detection delays. On the other hand, under the special case of (Formula presented.) if the compensation coefficient is small enough, then the TRAS algorithm is efficient at detecting a change that occurs at time ν = 0. Though it remains an open problem to develop general asymptotic optimality theorems, our results shed light on how to tune compensation coefficients suitably in real-world applications, and extensive numerical studies are conducted to validate our results.

Implicit Regularization Properties of Variance Reduced Stochastic Mirror Descent

Luo, Y., Huo, X., & Mei, Y. (n.d.).

Publication year

2022

Page(s)

696-701
Abstract
In machine learning and statistical data analysis, we often run into an objective function that is a summation: the number of terms in the summation is possibly equal to the sample size, which can be enormous. In such a setting, the stochastic mirror descent (SMD) algorithm is a numerically efficient method - each iteration involves only a very small subset of the data. The variance reduction version of SMD (VRSMD) can further improve SMD by inducing faster convergence. On the other hand, algorithms such as gradient descent and stochastic gradient descent have an implicit regularization property that leads to better performance in terms of generalization error. Little is known on whether such a property holds for VRSMD. We prove here that the discrete VRSMD estimator sequence converges to the minimum mirror interpolant in linear regression. This establishes the implicit regularization property for VRSMD. As an application of the above result, we derive a model estimation accuracy result in the setting where the true model is sparse. We use numerical examples to illustrate the empirical power of VRSMD.
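
The implicit regularization property is easy to demonstrate in the Euclidean special case: plain SGD on an underdetermined linear regression, started at zero, converges to the minimum-norm interpolant (VRSMD's mirror map generalizes the Euclidean norm to a Bregman divergence). A toy sketch:

```python
import random

def sgd_linear(X, y, lr=0.01, epochs=2000, seed=0):
    """Run SGD on the squared loss for linear regression from a zero
    start. With more unknowns than equations, the iterates stay in the
    row space of X, so SGD converges to the minimum-norm interpolant."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(epochs):
        i = rng.randrange(n)  # pick one training example at random
        residual = sum(w[j] * X[i][j] for j in range(d)) - y[i]
        for j in range(d):
            w[j] -= lr * residual * X[i][j]
    return w
```

For the single equation w1 + w2 = 2 (one sample, two unknowns), the minimum-norm interpolant is (1, 1), and the iterates converge to it.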

Improved performance properties of the CISPRT algorithm for distributed sequential detection

Liu, K., & Mei, Y. (n.d.).

Publication year

2020

Journal title

Signal Processing

Volume

172
Abstract
In distributed sequential detection problems, local sensors observe raw local observations over time and are allowed to communicate local information with their immediate neighborhood at each time step, so that the sensors can work together to make a quick but accurate decision when testing binary hypotheses on the true raw sensor distributions. One interesting algorithm is the Consensus-Innovation Sequential Probability Ratio Test (CISPRT) algorithm proposed by Sahu and Kar (IEEE Trans. Signal Process., 2016). In this article, we present improved finite-sample properties on the error probabilities and expected sample sizes of the CISPRT algorithm for Gaussian data in terms of network connectivity, and, more importantly, derive its sharp first-order asymptotic properties in the classical asymptotic regime where the Type I and II error probabilities go to 0. The usefulness of our theoretical results is validated through numerical simulations.

Information bounds and asymptotically optimal procedures for detecting changes in decentralized decision systems

Mei, Y. (n.d.).

Publication year

2004

Journal title

IEEE International Symposium on Information Theory - Proceedings

Page(s)

249
Abstract
Information bounds and asymptotically optimal procedures for decentralized quickest change detection under different scenarios were discussed. The decentralized system with limited local memory, where the sensors do not have access to their past observations, was considered. It was shown that in a decentralized decision system with Gaussian sensor observations, the detection delay of the monotone likelihood ratio quantizer (MLRQ) is at most π/2 - 1 ≈ 57% larger than that of the optimal centralized procedure. It was found that the method can be easily extended to non-Gaussian distributions.

Information bounds and quickest change detection in decentralized decision systems

Mei, Y. (n.d.).

Publication year

2005

Journal title

IEEE Transactions on Information Theory

Volume

51

Issue

7

Page(s)

2669-2681
Abstract
The quickest change detection problem is studied in decentralized decision systems, where a set of sensors receive independent observations and send summary messages to the fusion center, which makes a final decision. In the system where the sensors do not have access to their past observations, the previously conjectured asymptotic optimality of a procedure with a monotone likelihood ratio quantizer (MLRQ) is proved. In the case of additive Gaussian sensor noise, if the signal-to-noise ratios (SNR) at some sensors are sufficiently high, this procedure can perform as well as the optimal centralized procedure that has access to all the sensor observations. Even if all SNRs are low, its detection delay will be at most π/2 - 1 ≈ 57% larger than that of the optimal centralized procedure. Next, in the system where the sensors have full access to their past observations, the first asymptotically optimal procedure in the literature is developed. Surprisingly, the procedure has the same asymptotic performance as the optimal centralized procedure, although it may perform poorly in some practical situations because of slow asymptotic convergence. Finally, it is shown that neither past message information nor the feedback from the fusion center improves the asymptotic performance in the simplest model.

Information bounds for decentralized sequential detection

Mei, Y. (n.d.).

Publication year

2006

Page(s)

2647-2651
Abstract
The main purpose of this paper is to develop an asymptotic theory for decentralized sequential hypothesis testing problems under the frequentist framework. Sharp asymptotic bounds on the average sample numbers or sample sizes of sequential or fixed-sample tests are provided for decentralized decision systems in different scenarios, subject to error probability constraints. Asymptotically optimal tests are offered for the system with full local memory. Optimal binary quantizers are also studied in the case of additive Gaussian sensor noise.

Is average run length to false alarm always an informative criterion?

Mei, Y. (n.d.).

Publication year

2008

Journal title

Sequential Analysis

Volume

27

Issue

4

Page(s)

354-376
Abstract
Apart from Bayesian approaches, the average run length (ARL) to false alarm has always been seen as the natural performance criterion for quantifying the propensity of a detection scheme to make false alarms, and researchers seem not to have questioned whether it always applies. In this article, we show that in the change-point problem with mixture pre-change models, detection schemes with finite detection delays can have infinite ARLs to false alarm. We also discuss the implications of our results for the change-point problem with either exchangeable pre-change models or hidden Markov models. Alternative minimax formulations with different false alarm criteria are proposed.

Jugular Venous Catheterization is Not Associated with Increased Complications in Patients with Aneurysmal Subarachnoid Hemorrhage

Akbik, F., Shi, Y., Philips, S., Pimentel-Farias, C., Grossberg, J. A., Howard, B. M., Tong, F., Cawley, C. M., Samuels, O. B., Mei, Y., & Sadan, O. (n.d.).

Publication year

2024

Journal title

Neurocritical Care
Abstract
Background: Classic teaching in neurocritical care is to avoid jugular access for central venous catheterization (CVC) because of a presumed risk of increasing intracranial pressure (ICP). Limited data exist to test this hypothesis. Aneurysmal subarachnoid hemorrhage (aSAH) leads to diffuse cerebral edema and often requires external ventricular drains (EVDs), which provide direct ICP measurements. Here, we test whether CVC access site correlates with ICP measurements and catheter-associated complications in patients with aSAH. Methods: In a single-center retrospective cohort study, patients with aSAH admitted to Emory University Hospital between January 1, 2012, and December 31, 2020, were included. Patients were assigned by the access site of the first CVC placed. The subset of patients with an EVD were further studied. ICP measurements were analyzed using linear mixed effect models, with a binary comparison between internal jugular (IJ) and non-IJ access. Results: A total of 1577 patients were admitted during the study period with CVC access: subclavian (SC) (887, 56.2%), IJ (365, 23.1%), femoral (72, 4.6%), and peripherally inserted central catheter (PICC) (253, 16.0%). Traumatic pneumothorax was most common with SC access (3.0%, p < 0.01). Catheter-associated infections did not differ between sites. Catheter-associated deep venous thrombosis was most common with femoral (8.3%) and PICC (3.6%) access (p < 0.05). A total of 1220 patients had an EVD, which remained open by default, generating 351,462 ICP measurements. ICP measurements, compared over the first 24 post-insertion hours and the next 10 days, were similar between the two groups. Subgroup analysis accounting for World Federation of Neurological Surgeons grade on presentation yielded similar results. Conclusions: Contrary to classic teaching, we find that IJ CVC placement was not associated with increased ICP in the clinical context of the largest quantitative data set to date.
Further, IJ access was the least likely to be associated with an access-site complication when compared with SC, femoral, and PICC. Together, these data support the safety, and perhaps preference, of ultrasound-guided IJ venous catheterization in neurocritically ill patients.

Contact

yajun.mei@nyu.edu
708 Broadway, New York, NY 10003