
# Some definitions and useful facts


If not specified, in the following we let $f:\Bit^n\rightarrow\Bit$ be an arbitrary boolean function and $\mathbf{f}:\Bit^n\rightarrow\Bit$ be a random boolean function, in the sense that for each $x\in\Bit^n$ independently, $\mathbf{f}(x)=1$ with probability one half.

## Basic Fourier analysis of boolean functions

Let $S\subseteq[n]$, define the character for $S$ as $\chi_S:\Bit^n\rightarrow\Bit$ such that $\chi_S(x)=\prod_{i\in S}x_i$.

$\{\chi_S\}_{S\subseteq[n]}$ forms an orthonormal basis for $\mathcal{F}_n$, i.e., $\langle\chi_S,\chi_T\rangle=\mathbf{1}_{S=T}$

Let $f:\Bit^n\rightarrow\R$. Define the Fourier expansion of $f$ as follows. $$f = \sum_{S\subseteq[n]}\widehat{f}(S)\chi_S,$$ where we call $\widehat{f}(S)$ the Fourier coefficient of $f$ at $S$.

• $\widehat{f}(S) = \langle f,\chi_S\rangle$
• (Parseval) $\langle f,f\rangle = \sum_{S\subseteq[n]}\widehat{f}(S)^2$
• (Plancherel) $\langle f,g\rangle = \sum_{S\subseteq[n]}\widehat{f}(S)\widehat{g}(S)$
• $\Var[f] = \sum_{S\subseteq[n],S\neq\emptyset}\widehat{f}(S)^2$
• $\dist(f,g) := \mathbb{P}_{\bx}[f(\bx)\neq g(\bx)]$
• $\langle f,g\rangle = 1-2\dist(f,g)$
• $f$ is $\frac{1}{2}\left(1-\max_{S\subseteq[n]}\widehat{f}(S)\right)$-close to linear, since $\dist(f,\chi_S)=\frac{1}{2}(1-\widehat{f}(S))$
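These identities are easy to sanity-check numerically. Below is a minimal Python sketch (not from the notes; it uses $\text{Maj}_3$ as a running example, with helper names of my choosing) computing Fourier coefficients by the inner-product formula and verifying Parseval:

```python
from itertools import combinations, product

n = 3
cube = list(product([-1, 1], repeat=n))           # the domain {-1,1}^3

def maj(x):                                       # Maj_3: majority of three bits
    return 1 if sum(x) > 0 else -1

def chi(S, x):                                    # character chi_S(x) = prod_{i in S} x_i
    out = 1
    for i in S:
        out *= x[i]
    return out

def coeff(f, S):                                  # f_hat(S) = <f, chi_S> = E_x[f(x) chi_S(x)]
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

subsets = [S for k in range(n + 1) for S in combinations(range(n), k)]

# Maj_3 = (x1 + x2 + x3)/2 - (x1 x2 x3)/2, so each singleton coefficient
# is 1/2, the top coefficient is -1/2, and everything else vanishes.
for S in subsets:
    print(S, coeff(maj, S))

# Parseval: <f,f> = 1 for a boolean f, so the squared coefficients sum to 1.
print(sum(coeff(maj, S) ** 2 for S in subsets))   # -> 1.0
```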

The convolution of $f$ and $g$ is $f\star g(x):=\mathbb{E}_{\by}[f(x-\by)g(\by)]$

• $f\star g(x)=\mathbb{E}_{\by}[f(x-\by)g(\by)]=\mathbb{E}_{\by}[f(\by)g(x-\by)]$, i.e., $f\star g=g\star f$
• $\widehat{f\star g}(S) = \widehat{f}(S)\cdot\widehat{g}(S)$
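The convolution identity can be checked numerically as well. A sketch (my names; it uses the fact that in $\Bit^n$ the group operation is coordinatewise multiplication and every element is its own inverse, so "$x-\by$" is just the coordinatewise product) convolving $\text{Maj}_3$ with itself:

```python
from itertools import combinations, product

n = 3
cube = list(product([-1, 1], repeat=n))

def maj(x):
    return 1 if sum(x) > 0 else -1

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def coeff(f, S):
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

def conv(f, g):
    # In {-1,1}^n the group operation is coordinatewise multiplication and
    # every element is its own inverse, so "x - y" is the product x*y.
    return lambda x: sum(
        f(tuple(a * b for a, b in zip(x, y))) * g(y) for y in cube
    ) / len(cube)

h = conv(maj, maj)
for k in range(n + 1):
    for S in combinations(range(n), k):
        # hat(f * g)(S) = f_hat(S) g_hat(S); here both factors are Maj_3.
        print(S, coeff(h, S), coeff(maj, S) ** 2)
```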

## Influence, noise stability, and noise sensitivity

For any $i\in[n]$, define the $i$th influence of $f$ as $$\bfI_i[f] := \mathbb{P}[f(x_1,\dots,x_{i-1},-1,x_{i+1},\dots,x_n)\neq f(x_1,\dots,x_{i-1},1,x_{i+1},\dots,x_n)].$$ The total influence of $f$ is defined as the sum of the coordinate-wise influences, i.e., $$\bfI[f] := \sum_{i\in[n]}\bfI_i[f].$$

• $\bfI_i[f] = \sum_{S:i\in S}\widehat{f}(S)^2$
• If $f$ is monotone, then $\bfI_i[f]=\widehat{f}(\{i\})$
• $\bfI[f]=\sum_{k\in[n]}k\cdot \mathbf{W}^k[f]$, where $\mathbf{W}^k[f]:=\sum_{\card{S}=k}\widehat{f}(S)^2$ denotes the Fourier weight of $f$ at degree $k$
• $\bfI[f]=2\mathbb{E}_{x\sim\Bit^n}[\sharp(-1)\text{-pivotal coordinates for }f\text{ on }x]$
• $\bfI_i[f]=\mathbb{E}_{x\sim\Bit^n}[(D_if)^2]$
• $\bfI[f]=\langle f,Lf\rangle$; coordinate-wise, $\bfI_i[f]=\langle f,L_if\rangle=\langle L_if,L_if\rangle$ since each $L_i$ is a projection
• (Poincaré inequality) $\Var[f]\leq\bfI[f]$; in particular, for unbiased $f$, $\bfI[f]\geq1$.
• (KKL Theorem) $\max_{i\in[n]}{\bfI_i[f]}\geq\Var[f]\cdot\Omega(\frac{\log n}{n})$
| Function | Total influence |
| --- | --- |
| $\text{Maj}_n$ | $\Theta(\sqrt{n})$ |
| DNF/CNF of width $w$ | $\leq 2w$ |
| DNF/CNF of size $s$ | $\leq O(\log s)$ |
| $\text{Tribes}_n$ | $(\ln n)(1\pm o(1))$ |
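A small sketch (again on $\text{Maj}_3$; helper names are mine) checks the combinatorial definition of $\bfI_i$ against the Fourier formula, and the total influence against the weight identity:

```python
from itertools import combinations, product

n = 3
cube = list(product([-1, 1], repeat=n))

def maj(x):
    return 1 if sum(x) > 0 else -1

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def coeff(f, S):
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

def flip(x, i):                      # x with coordinate i negated
    return x[:i] + (-x[i],) + x[i + 1:]

def inf(f, i):                       # I_i[f] = P[f(x) != f(x with bit i flipped)]
    return sum(f(x) != f(flip(x, i)) for x in cube) / len(cube)

subsets = [S for k in range(n + 1) for S in combinations(range(n), k)]

for i in range(n):
    fourier = sum(coeff(maj, S) ** 2 for S in subsets if i in S)
    print(i, inf(maj, i), fourier)   # both are 1/2 for Maj_3

# Total influence: sum_i I_i[f] = sum_k k * W^k[f]  (= 3/2 for Maj_3)
total = sum(inf(maj, i) for i in range(n))
weighted = sum(len(S) * coeff(maj, S) ** 2 for S in subsets)
print(total, weighted)
```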

Let $\rho\in[-1,1]$ and $x\in\Bit^n$. We say a random vector $\mathbf{y}\in\Bit^n$ is $\rho$-correlated with $x$ if independently for each $i\in[n]$, $$\mathbf{y}_i=\begin{cases}\sign(\rho)x_i&\text{w.p. }\card{\rho},\\ \text{uniformly random}&\text{w.p. }1-\card{\rho}.\end{cases}$$ We denote this as $\by\sim N_{\rho}(x)$.

Furthermore, when we draw $\bx$ uniformly from $\Bit^n$ and $\by\sim N_{\rho}(\bx)$, we call $(\bx,\by)$ a $\rho$-correlated pair.

• $\mathbb{E}[\bx_i]=\mathbb{E}[\by_i]=0$
• $\mathbb{E}[\bx_i\by_i]=\rho$
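The sampling procedure can be sketched directly from the definition; a quick Monte Carlo check of $\mathbb{E}[\bx_i\by_i]=\rho$ (function and parameter names are mine; the tolerance is loose enough for any seed):

```python
import random

random.seed(0)

def corr_pair(n, rho):
    """Draw x uniformly from {-1,1}^n and y ~ N_rho(x), per the definition."""
    x = [random.choice([-1, 1]) for _ in range(n)]
    sgn = 1 if rho >= 0 else -1
    y = [sgn * xi if random.random() < abs(rho) else random.choice([-1, 1])
         for xi in x]
    return x, y

rho, trials = 0.6, 200_000
est = sum(x[0] * y[0] for x, y in (corr_pair(3, rho) for _ in range(trials))) / trials
print(est)   # close to rho = 0.6; each marginal E[x_i], E[y_i] is 0
```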

Define the noise stability of $f$ at $\rho$ as $\Stab_{\rho}[f]:=\mathbb{E}_{\bx,\by\sim N_{\rho}(\bx)}[f(\bx)f(\by)]$

• $\Stab_{\rho}[\chi_S]=\rho^{\card{S}}$
• $\Stab_{\rho}[f] = \sum_{k\in[n]}\rho^k\bfW^k[f]$
• $\bfI[f] = \frac{d}{d\rho}\Stab_{\rho}[f]\big|_{\rho=1}$
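The first two facts can be verified exactly for $\text{Maj}_3$ by enumerating both points of the correlated pair (a sketch; for $\rho\geq0$ each coordinate of $\by$ agrees with $\bx$ with probability $(1+\rho)/2$, which follows from the definition of $\rho$-correlation):

```python
from itertools import combinations, product

n = 3
cube = list(product([-1, 1], repeat=n))

def maj(x):
    return 1 if sum(x) > 0 else -1

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def coeff(f, S):
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

def stab(f, rho):
    """Stab_rho[f] = E[f(x) f(y)] over a rho-correlated pair, exactly.

    For rho >= 0 each coordinate of y agrees with x w.p. (1 + rho)/2."""
    p = (1 + rho) / 2
    total = 0.0
    for x in cube:
        for y in cube:
            agree = sum(a == b for a, b in zip(x, y))
            total += (p ** agree) * ((1 - p) ** (n - agree)) * f(x) * f(y)
    return total / len(cube)         # average over the uniform x

rho = 0.5
weights = [sum(coeff(maj, S) ** 2 for S in combinations(range(n), k))
           for k in range(n + 1)]    # W^k[Maj_3] = [0, 3/4, 0, 1/4]
print(stab(maj, rho))                                    # exact definition
print(sum(rho ** k * w for k, w in enumerate(weights)))  # Fourier formula
```

Both routes give $\rho\cdot\frac34+\rho^3\cdot\frac14$, the degree-weighted sum from the second bullet.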

Let $x\in\Bit^n$ and $\delta\in[0,1]$. We say a random vector $\mathbf{y}\in\Bit^n$ is obtained from $x$ with flipping probability $\delta$ if independently for each $i\in[n]$, $$\mathbf{y}_i=\begin{cases}x_i&\text{w.p. }1-\delta,\\ -x_i&\text{w.p. }\delta.\end{cases}$$ We denote this as $\by\leftarrow_{\delta} x$.

• When $\delta=\frac{1}{2}-\frac{1}{2}\rho$, i.e., $\rho=1-2\delta$, drawing $\bx$ uniformly and $\by\leftarrow_{\delta}\bx$ is equivalent to $(\bx,\by)$ being a $\rho$-correlated pair.

Define the noise sensitivity of $f$ with flipping probability $\delta$ as $$\NS_{\delta}[f] := \mathbb{P}_{\bx,\by\leftarrow_{\delta}\bx}[f(\bx)\neq f(\by)].$$

• $\NS_{\delta}[f] = \frac{1}{2}-\frac{1}{2}\Stab_{1-2\delta}[f]$
• $\NS_{\delta}[f] = \frac{1}{2}\sum_{0\leq k\leq n}(1-(1-2\delta)^k)\cdot \bfW^k[f]$
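The identity $\NS_{\delta}[f]=\frac12-\frac12\Stab_{1-2\delta}[f]$ can be checked exactly (a sketch on $\text{Maj}_3$; names are mine), computing the left side from the flipping-probability definition and the right side from the Fourier weights:

```python
from itertools import combinations, product

n = 3
cube = list(product([-1, 1], repeat=n))

def maj(x):
    return 1 if sum(x) > 0 else -1

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def coeff(f, S):
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

def ns(f, delta):
    """NS_delta[f] = P[f(x) != f(y)], y obtained by flipping each bit w.p. delta."""
    total = 0.0
    for x in cube:
        for y in cube:
            flips = sum(a != b for a, b in zip(x, y))
            total += (delta ** flips) * ((1 - delta) ** (n - flips)) * (f(x) != f(y))
    return total / len(cube)

delta = 0.1
rho = 1 - 2 * delta
weights = [sum(coeff(maj, S) ** 2 for S in combinations(range(n), k))
           for k in range(n + 1)]
stab = sum(rho ** k * w for k, w in enumerate(weights))
print(ns(maj, delta), 0.5 - 0.5 * stab)   # the two sides of the identity
```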

For any $\rho\in[-1,1]$, define operator $T_{\rho}$ as follows. $$T_{\rho}f(x) := \mathbb{E}_{\by\sim N_{\rho}(x)}[f(\by)].$$

• $T_{\rho}f = \sum_{S\subseteq[n]}\rho^{\card{S}}\widehat{f}(S)\chi_S$
• $\Stab_{\rho}[f]=\langle f,T_{\rho}f \rangle$
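A sketch (on $\text{Maj}_3$, names mine) computing $T_{\rho}f$ both from the definition, i.e., averaging $f$ over $\by\sim N_{\rho}(x)$, and from its Fourier expansion, then checking $\langle f,T_{\rho}f\rangle=\Stab_{\rho}[f]$:

```python
from itertools import combinations, product

n = 3
cube = list(product([-1, 1], repeat=n))

def maj(x):
    return 1 if sum(x) > 0 else -1

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def coeff(f, S):
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

subsets = [S for k in range(n + 1) for S in combinations(range(n), k)]

def T_direct(f, rho):
    # T_rho f(x) = E_{y ~ N_rho(x)}[f(y)]: for rho >= 0, each y_i agrees
    # with x_i independently with probability (1 + rho)/2.
    p = (1 + rho) / 2
    def g(x):
        return sum(
            (p ** sum(a == b for a, b in zip(x, y)))
            * ((1 - p) ** sum(a != b for a, b in zip(x, y)))
            * f(y)
            for y in cube
        )
    return g

def T_fourier(f, rho):
    # T_rho f = sum_S rho^{|S|} f_hat(S) chi_S
    coeffs = {S: rho ** len(S) * coeff(f, S) for S in subsets}
    return lambda x: sum(c * chi(S, x) for S, c in coeffs.items())

rho = 0.5
g1, g2 = T_direct(maj, rho), T_fourier(maj, rho)
for x in cube:
    print(x, g1(x), g2(x))           # the two computations agree pointwise

# <f, T_rho f> equals Stab_rho[f] (= 13/32 for Maj_3 at rho = 1/2)
inner = sum(maj(x) * g1(x) for x in cube) / len(cube)
print(inner)
```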

Inspired by the fact that $\bfI_i[f]=\mathbb{E}_{x\sim\Bit^n}[(D_if)^2]$, we define the following notion of stable influence to connect $\bfI$ and $\Stab$.

Let $\rho\in[-1,1]$, define the $\rho$-stable influence of $f$ as $$\bfI_i^{(\rho)}[f] := \Stab_{\rho}[D_if] = \sum_{i\in S}\rho^{\card{S}-1}\widehat{f}(S)^2.$$ Similarly, define $\bfI^{(\rho)}[f] = \sum_{i\in[n]}\bfI_i^{(\rho)}[f]$.

• Let $f:\Bit^n\rightarrow\R$ such that $\Var[f]\leq1$ and $0<\delta,\epsilon<1$, we have $\card{\{i\in[n]:\ \bfI_i^{(1-\delta)}[f]\geq\epsilon\}}\leq\frac{1}{\delta\epsilon}$. (See this note for the proof of a key inequality)
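A sketch (on $\text{Maj}_3$, names mine) checking $\Stab_{\rho}[D_if]$, computed via the Fourier weights of the discrete derivative $D_if(x)=\frac12(f(x^{i\to1})-f(x^{i\to-1}))$, against the closed form $\sum_{S\ni i}\rho^{\card{S}-1}\widehat{f}(S)^2$:

```python
from itertools import combinations, product

n = 3
cube = list(product([-1, 1], repeat=n))

def maj(x):
    return 1 if sum(x) > 0 else -1

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def coeff(f, S):
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

subsets = [S for k in range(n + 1) for S in combinations(range(n), k)]

def D(f, i):
    """Discrete derivative: D_i f(x) = (f(x with x_i=1) - f(x with x_i=-1)) / 2."""
    def g(x):
        hi = x[:i] + (1,) + x[i + 1:]
        lo = x[:i] + (-1,) + x[i + 1:]
        return (f(hi) - f(lo)) / 2
    return g

rho, i = 0.5, 0
# Stab_rho[D_i f] via the Fourier weights of D_i f ...
d = D(maj, i)
lhs = sum(rho ** len(S) * coeff(d, S) ** 2 for S in subsets)
# ... against the closed form sum_{S: i in S} rho^{|S|-1} f_hat(S)^2.
rhs = sum(rho ** (len(S) - 1) * coeff(maj, S) ** 2 for S in subsets if i in S)
print(lhs, rhs)   # both equal 1/4 + rho^2/4 for Maj_3
```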

## Spectral properties

For any $\epsilon>0$, we say a function $f:\{-1,1\}^n\rightarrow\mathbb{R}$ is $\epsilon$-concentrated on degree up to $k$ if $\mathbf{W}^{\geq k}[f]\leq\epsilon$.

• For any $\epsilon>0$, the Fourier spectrum of $f:\{-1,1\}^n\rightarrow\mathbb{R}$ is $\epsilon$-concentrated on degree up to $\mathbf{I}[f]/\epsilon$.
• (LMN theorem) Let $f$ be computable by a depth-$d$, size-$s$ $\mathbf{AC}^0$ circuit; then the Fourier spectrum of $f$ is $\epsilon$-concentrated on degree up to $O(\log(s/\epsilon))^{d-1}\log(1/\epsilon)$.
• Håstad improved this to degree up to $O(\log(s/\epsilon))^{d-2}\log s\cdot\log(1/\epsilon)$.

## Random function

In the following, let $\mathbf{f}:\{-1,1\}^n\rightarrow\{-1,1\}$ be a random boolean function, in the sense that for each $x\in\{-1,1\}^n$ independently, $\mathbf{f}(x)=1$ with probability one half.

• For any $S\subseteq[n]$, $\widehat{\mathbf{f}}(S)$ is a random variable with mean $0$ and variance $2^{-n}$.
• With probability at least $1/2$, $\mathbf{f}$ is not $1/4$-concentrated on degree up to $n/2$.
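The first fact can be checked exactly for small $n$ by enumerating all $2^{2^n}$ boolean functions, each equally likely; a sketch for $n=2$ (helper names are mine):

```python
from itertools import product

n = 2
cube = list(product([-1, 1], repeat=n))   # 4 points, so 2^4 = 16 functions

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def coeff(vals, S):
    # vals assigns +/-1 to each point of the cube; f_hat(S) = E_x[f(x) chi_S(x)]
    return sum(v * chi(S, x) for x, v in zip(cube, vals)) / len(cube)

S = (0,)
samples = [coeff(vals, S) for vals in product([-1, 1], repeat=len(cube))]
mean = sum(samples) / len(samples)
var = sum(c ** 2 for c in samples) / len(samples) - mean ** 2
print(mean, var)   # mean 0, variance 2^{-n} = 0.25
```

Each value $\mathbf{f}(x)$ is an independent uniform $\pm1$ sign, so $\widehat{\mathbf{f}}(S)$ is an average of $2^n$ independent $\pm2^{-n}$ terms, giving variance $2^n\cdot2^{-2n}=2^{-n}$.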