Abstract: We are concerned with the issue of detecting model changes in probability distributions. We specifically consider strategies based on the minimum description length (MDL) principle and theoretically analyze their basic performance from two aspects: data compression and hypothesis testing. From the viewpoint of data compression, we derive a new bound on the minimax regret for model changes, where the minimax regret is defined as the worst-case code-length relative to the least normalized maximum likelihood (NML) code-length over all change patterns. From the viewpoint of hypothesis testing, we reduce change detection to a simple hypothesis testing problem and thereby derive upper bounds on the error probabilities of the MDL-based test. These bounds are valid for finite sample sizes and are related to the information-theoretic complexity as well as a discrepancy measure between the hypotheses to be tested.
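For orientation, a sketch of the minimax regret and NML code referred to above, written in the standard single-model (no-change) form; the symbols $x^n$, $p_\theta$, $\mathcal{M}$, and $R_n$ are illustrative and are not taken from this paper's notation:
\[
  R_n(\mathcal{M})
  \;=\; \min_{q}\,\max_{x^n}
    \Bigl[\, -\log q(x^n) \;-\; \bigl(-\log \max_{\theta} p_{\theta}(x^n)\bigr) \Bigr]
  \;=\; \log \sum_{x^n} \max_{\theta} p_{\theta}(x^n),
\]
the minimum being attained by the normalized maximum likelihood distribution
\[
  p_{\mathrm{NML}}(x^n)
  \;=\; \frac{\max_{\theta} p_{\theta}(x^n)}{\sum_{y^n} \max_{\theta} p_{\theta}(y^n)}.
\]
The bound stated in the abstract concerns the analogous quantity when the worst case is additionally taken over change patterns of the underlying model.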