
Kullback-Leibler divergence

Let $\mathbf{P}$ and $\mathbf{Q}$ be discrete probability distributions with pmfs $p$ and $q$ respectively, and assume that $\mathbf{P}$ and $\mathbf{Q}$ have a common sample space $E$. Then the KL divergence between $\mathbf{P}$ and $\mathbf{Q}$ is defined by

$$\text{KL}(\mathbf{P}, \mathbf{Q}) = \sum_{x \in E} p(x) \ln \left( \frac{p(x)}{q(x)} \right).$$

It has two basic properties:

- $\text{KL}(\mathbf{P}, \mathbf{Q}) \geq 0$ (nonnegative)
- $\text{KL}(\mathbf{P}, \mathbf{Q}) = 0$ only if $\mathbf{P}$ and $\mathbf{Q}$ are the same distribution (definite)
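A minimal sketch of this definition in Python, assuming both pmfs are given as dicts over the same sample space and that $q(x) > 0$ wherever $p(x) > 0$; the helper name kl_divergence is just for illustration:

    import math

    def kl_divergence(p, q):
        """KL(P, Q) for discrete pmfs given as dicts {outcome: probability}.

        Assumes a common sample space and q(x) > 0 wherever p(x) > 0;
        terms with p(x) = 0 contribute 0 by the usual convention.
        """
        return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

    # KL is nonnegative and equals 0 only for identical distributions.
    p = {"a": 0.5, "b": 0.5}
    q = {"a": 0.9, "b": 0.1}
    print(kl_divergence(p, q))  # > 0
    print(kl_divergence(p, p))  # 0.0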
Recent posts

False positive, False negative, Type I error, Type II error

In general, with the actual truth in the rows and the decision in the columns:

| Actual \ Decision | True | False |
| True | Correct | Type II error |
| False | Type I error | Correct |

Binary classification:

| | Actual Class Positive | Actual Class Negative |
| Assigned Positive | True Positive | False Positive |
| Assigned Negative | False Negative | True Negative |

Statistics:

"A positive result corresponds to rejecting the null hypothesis, while a negative result corresponds to failing to reject the null hypothesis; 'false' means the conclusion drawn is incorrect. Thus a type I error is a false positive, and a type II error is a false negative." — Wikipedia

| | Null hypothesis ($H_0$) True | Null hypothesis ($H_0$) False |
| Fail to reject | Correct (True Negative) ($1-\alpha$, confidence level) | Type II error (False Negative) ($\beta$) |
| Reject | Type I error (False Positive) ($\alpha$, significance level) | Correct (True Positive) ($1-\beta$, power) |
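As a small illustration, the four cells can be counted directly from paired labels. This sketch assumes boolean labels with True meaning positive; the function name count_confusion is hypothetical:

    def count_confusion(actual, predicted):
        """Count (TP, FP, FN, TN) from parallel lists of booleans (True = positive)."""
        tp = sum(a and p for a, p in zip(actual, predicted))
        fp = sum((not a) and p for a, p in zip(actual, predicted))        # Type I error
        fn = sum(a and (not p) for a, p in zip(actual, predicted))        # Type II error
        tn = sum((not a) and (not p) for a, p in zip(actual, predicted))
        return tp, fp, fn, tn

    actual    = [True, True, False, False, True]
    predicted = [True, False, True, False, True]
    print(count_confusion(actual, predicted))  # (2, 1, 1, 1)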

inequalities in fundamental statistics

Hoeffding's Inequality

Given $n$ ($n > 0$) i.i.d. copies $X_1, X_2, \ldots, X_n$ of a random variable $X$ that is almost surely bounded, meaning $\mathbf{P}(X \notin [a, b]) = 0$:

$$\mathbf{P}\left(\left| \bar{X}_n - \mathbb{E}[X] \right| \ge \epsilon\right) \le 2 \exp\left(-\frac{2n\epsilon^2}{(b-a)^2}\right) \qquad \text{for all } \epsilon > 0.$$

Unlike the central limit theorem, Hoeffding's inequality does not require the sample size $n$ to be large.

Markov inequality

For a random variable $X \ge 0$ with mean $\mu > 0$, and any number $t > 0$:

$$\mathbf{P}(X \ge t) \le \frac{\mu}{t}.$$
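A quick simulation sketch of the Hoeffding bound, under the assumed setup of Bernoulli(0.5) samples (so $a = 0$, $b = 1$ and $\mathbb{E}[X] = 0.5$); the values of n, eps and trials are arbitrary choices:

    import math
    import random

    n, eps, trials = 50, 0.1, 20000
    mean = 0.5                                   # E[X] for Bernoulli(0.5)

    hits = 0
    for _ in range(trials):
        xbar = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(xbar - mean) >= eps:
            hits += 1

    empirical = hits / trials
    hoeffding = 2 * math.exp(-2 * n * eps**2 / (1 - 0)**2)
    print(empirical, "<=", hoeffding)            # the bound holds even for small n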

Vscode freeze linux with nvidia libs

The system hung whenever VS Code was started. The fix followed INSTALLING NVIDIA DRIVERS ON RHEL OR CENTOS 7 ( http://www.advancedclustering.com/act_kb/installing-nvidia-drivers-rhel-centos-7/ ). Some of the steps had already been completed, possibly by the NVIDIA package install script. The missing parts were:

1. Add "blacklist nouveau" to /etc/modprobe.d/blacklist.conf
2. Rebuild the initramfs without nouveau:

    mv /boot/initramfs-$(uname -r).img /boot/initramfs-$(uname -r)-nouveau.img
    dracut /boot/initramfs-$(uname -r).img $(uname -r)

Linux CPU Usage

Command line tools: 14 Command Line Tools to Check CPU Usage in Linux ( https://linoxide.com/monitoring-2/10-tools-monitor-cpu-performance-usage-linux-command-line/ )

Understanding: UNDERSTANDING CPU USAGE IN LINUX ( https://www.opsdash.com/blog/cpu-usage-linux.html )

The /proc filesystem can also be used to read this information directly.
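For example, a minimal sketch of reading /proc/stat to estimate overall CPU usage between two samples (field layout per proc(5): user, nice, system, idle, iowait, ...); the one-second sampling interval is an arbitrary choice:

    import time

    def cpu_times():
        """Return (idle, total) jiffies from the aggregate 'cpu' line of /proc/stat."""
        with open("/proc/stat") as f:
            fields = [int(v) for v in f.readline().split()[1:]]
        idle = fields[3] + fields[4]      # idle + iowait
        return idle, sum(fields)

    idle1, total1 = cpu_times()
    time.sleep(1)
    idle2, total2 = cpu_times()

    busy = 1 - (idle2 - idle1) / (total2 - total1)
    print(f"CPU usage over the last second: {busy:.1%}")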