Some Notes on Functional Analysis

*These notes are based heavily on Folland's Real Analysis and other resources, with my own commentary thrown in.*

Functional Analysis generalizes the techniques of real and complex analysis to vector spaces of infinite dimension over $\mathbb{R}$ and $\mathbb{C}$.

Normed Vector Spaces

Let $K$ be a field (typically $\mathbb{R}$ or $\mathbb{C}$) and $\mathcal{X}$ a vector space over $K$.
Definition: A seminorm on $\mathcal{X}$ is a function $x \mapsto \| x \|$ from $\mathcal{X}$ to $[0, \infty)$ satisfying
  • $\| x + y \| \leq \|x \| + \| y \|$ for all $x, y \in \mathcal{X}$
  • $\| \lambda x\| = |\lambda | \|x \|$ for all $x \in \mathcal{X}$ and $\lambda \in K$.

A seminorm is a norm provided that $\| x \| = 0 \implies x = 0$. A vector space equipped with a norm is called a normed vector space.

The function $\rho (x, y) = \| x - y\|$ defines a metric on $\mathcal{X}$, and the topology it induces is called the norm topology. We say that two norms $\| \cdot \|_1, \| \cdot \|_2$ are equivalent if there exist $C_1, C_2 > 0$ such that $$C_1 \|x\|_1 \leq \|x\|_2 \leq C_2\|x\|_1 \quad \forall x \in \mathcal{X}$$ A normed vector space which is complete with respect to the norm metric is called a Banach space.
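As a quick numerical sanity check (a Python sketch of my own, not part of Folland), the $\ell^1$ and $\ell^2$ norms on $\mathbb{R}^n$ are equivalent, with the standard constants $C_1 = 1$ and $C_2 = \sqrt{n}$ when $\|\cdot\|_2$ is sandwiched by $\|\cdot\|_1$:

```python
import math
import random

# Illustration (not from the notes): on R^n the l^1 and l^2 norms are
# equivalent, with  ||x||_2 <= ||x||_1 <= sqrt(n) * ||x||_2.
def norm1(x):
    """The l^1 norm: sum of absolute values."""
    return sum(abs(t) for t in x)

def norm2(x):
    """The l^2 (Euclidean) norm."""
    return math.sqrt(sum(t * t for t in x))

n = 5
random.seed(0)
for _ in range(1000):
    x = [random.uniform(-1, 1) for _ in range(n)]
    # Check both equivalence inequalities on random vectors
    # (small slack for floating-point rounding).
    assert norm2(x) <= norm1(x) + 1e-12
    assert norm1(x) <= math.sqrt(n) * norm2(x) + 1e-12
```

In infinite dimensions this phenomenon fails in general, which is one reason the choice of norm genuinely matters there.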
Theorem: A normed vector space $\mathcal{X}$ is complete iff every absolutely convergent series in $\mathcal{X}$ converges.

Proof: Suppose $\mathcal{X}$ is complete and $\sum_1^\infty \| x_n \| < \infty$; let $S_N = \sum_1^N x_n$. Then for $N > M$ we have $$\| S_N - S_M \| \leq \sum_{M+1}^N \| x_n \| \to 0 \ \text{as} \ M, N \to \infty,$$ so the sequence $\{S_N\}$ is Cauchy and hence convergent. Conversely, suppose that every absolutely convergent series converges, and let $\{ x_n \}$ be a Cauchy sequence. We can choose $n_1 < n_2 < \cdots$ such that $\|x_n - x_m\| < 2^{-j}$ for $m, n \geq n_j$. Let $y_1 = x_{n_1}$ and $y_j = x_{n_j} - x_{n_{j-1}}$ for $j > 1$. Then $\sum_{1}^k y_j = x_{n_k}$, and $$ \sum_{1}^\infty \| y_j \| \leq \| y_1 \| + \sum_1^\infty 2^{-j} = \|y_1\| + 1 < \infty,$$ so $\lim x_{n_k} = \sum_{1}^\infty y_j$ exists. Since $\{ x_n \}$ is Cauchy, it is easily verified that $\{ x_n \}$ converges to the same limit as $\{x_{n_k}\}$.
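The tail estimate $\| S_N - S_M \| \leq \sum_{M+1}^N \| x_n \|$ in the proof can be illustrated numerically. A Python sketch (my own illustration, using the complete space $\mathbb{R}$ and the absolutely convergent series $\sum (-1)^n/n^2$):

```python
# Illustration (not from the notes): in the complete space R, the series
# sum of (-1)^n / n^2 is absolutely convergent, and the tail of the norm
# series bounds how far the partial sums can still move -- exactly the
# estimate used in the first half of the proof.
def x(n):
    return (-1) ** n / n**2

S = 0.0
partial = []  # partial[k-1] = S_k = x(1) + ... + x(k)
for n in range(1, 10001):
    S += x(n)
    partial.append(S)

M, N = 100, 10000
# ||S_N - S_M|| <= sum of ||x_n|| over n = M+1, ..., N
tail = sum(abs(x(n)) for n in range(M + 1, N + 1))
assert abs(partial[N - 1] - partial[M - 1]) <= tail + 1e-12
```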

Now we move on to the really interesting stuff: examining the behavior of linear maps on normed vector spaces of possibly infinite dimension. There are quite a few parallels to linear algebra on finite-dimensional spaces, along with a few subtle differences.

The definition of boundedness for linear maps on normed spaces differs from the one we're accustomed to for functions on a set.

Definition: A linear map $T:\mathcal{X} \to \mathcal{Y}$ between normed vector spaces is bounded provided there exists $C \geq 0$ such that $\| Tx \| \leq C \|x \|$ for all $x \in \mathcal{X}$. The dependence of the bound on $\| x \|$ is what distinguishes this notion of boundedness from boundedness on a set: by homogeneity, a nonzero linear map is never bounded as a function on all of $\mathcal{X}$.
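A finite-dimensional illustration in Python (a sketch of mine, not from the notes): a matrix acting on $(\mathbb{R}^3, \|\cdot\|_\infty)$ is a bounded linear map, and the maximum absolute row sum is an admissible constant $C$. The matrix `A` below is arbitrary, chosen only for illustration.

```python
import random

# Sketch: T given by the matrix A, acting on (R^3, sup norm).
# C = max over rows of sum |entries| satisfies ||Tx||_inf <= C ||x||_inf,
# so T is bounded in the sense of the definition above.
A = [[1.0, -2.0, 0.5],
     [0.0, 3.0, -1.0]]

def apply(A, x):
    """Matrix-vector product: (Tx)_i = sum_j A[i][j] * x[j]."""
    return [sum(a * t for a, t in zip(row, x)) for row in A]

def norm_inf(x):
    """The sup norm on a finite-dimensional coordinate space."""
    return max(abs(t) for t in x)

C = max(sum(abs(a) for a in row) for row in A)

random.seed(1)
for _ in range(1000):
    x = [random.uniform(-1, 1) for _ in range(3)]
    assert norm_inf(apply(A, x)) <= C * norm_inf(x) + 1e-12
```

In finite dimensions every linear map is bounded; the definition only becomes restrictive once $\mathcal{X}$ is infinite-dimensional.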

The next proposition clarifies the behavior of linear maps on normed vector spaces: for linear maps, continuity and boundedness turn out to be equivalent.

Proposition: Let $T: \mathcal{X} \to \mathcal{Y}$ be a linear map between normed vector spaces $\mathcal{X}, \mathcal{Y}$. Then the following are equivalent:
  • a. $T$ is continuous
  • b. $T$ is continuous at $0$
  • c. $T$ is bounded


Proof: That a. implies b. is obvious. For b. implies c., suppose $T$ is continuous at $0$. Recall that a linear map necessarily satisfies $T(0) = 0$, so by continuity at $0$ there exists a neighborhood $U$ of $0$ such that $T(U) \subset \{y \in \mathcal{Y} \mid \| y \| \leq 1\}$, and hence a ball $B = \{ x \in \mathcal{X} \mid \| x \| \leq \delta \} \subset U$ on which $\| Tx \| \leq 1$. Now for any $x \neq 0$, the vector $u = \delta x / \| x \|$ lies in $B$, so $1 \geq \| Tu \| = \delta \| x \|^{-1} \| Tx \|$, i.e. $\| Tx \| \leq \delta^{-1} \| x \|$. Thus b. $\implies$ c. Finally, suppose $T$ is bounded, so that $\| T x \| \leq C\| x \|$ for all $x$ (if $C = 0$ then $T = 0$ is trivially continuous, so assume $C > 0$). Then $\| Tx_1 - Tx_2 \| = \|T(x_1 - x_2) \| \leq C \|x_1 - x_2\| \leq \varepsilon$ whenever $\|x_1 - x_2\| \leq C^{-1}\varepsilon$, so $T$ is (uniformly) continuous. This concludes the proof.
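To see how boundedness can fail in infinite dimensions, a classic supplementary example (my own addition, not from this section of the notes) is differentiation on the space of polynomials with the sup norm on $[0,1]$: taking $x_n(t) = t^n$ gives $\|x_n\| = 1$ while $\|x_n'\| = n$, so no constant $C$ can work. A Python sketch:

```python
# Sketch (supplementary illustration): differentiation T = d/dt on
# polynomials with the sup norm on [0, 1] is linear but unbounded.
# For x_n(t) = t^n we have ||x_n|| = 1 and ||T x_n|| = n, so the ratio
# ||T x_n|| / ||x_n|| = n grows without bound.
def sup_on_grid(f, pts=10001):
    """Approximate the sup norm on [0, 1] by sampling a uniform grid."""
    return max(abs(f(i / (pts - 1))) for i in range(pts))

for n in (1, 5, 25, 125):
    xn = lambda t, n=n: t ** n          # x_n(t) = t^n
    dxn = lambda t, n=n: n * t ** (n - 1)  # x_n'(t) = n t^(n-1)
    ratio = sup_on_grid(dxn) / sup_on_grid(xn)
    # Both sups are attained at t = 1, a grid point, so the ratio is
    # exactly n here.
    assert abs(ratio - n) < 1e-9
```

By the proposition, this operator is therefore not continuous either, something that cannot happen for linear maps on finite-dimensional spaces.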