top | item 44483854

998244353 | 7 months ago

The set of all real->real functions is still a vector space.
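As a minimal sketch of that vector-space structure (Python, representing functions as callables; the helper names `vadd`/`vscale` are mine, not anything from the thread):

```python
import math

# Pointwise addition and scalar multiplication are the vector-space
# operations on real->real functions; the zero vector is x -> 0.
def vadd(f, g):
    return lambda x: f(x) + g(x)

def vscale(c, f):
    return lambda x: c * f(x)

# h = 2*sin + cos, a finite linear combination of two "vectors"
h = vadd(vscale(2.0, math.sin), math.cos)
```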

This vector space also has a basis (even if it is not as useful): there is an (uncountably infinite) subset of real->real functions such that every function can be expressed as a linear combination of a finite number of these basis functions, in exactly one way.

There isn't a clean way to write down this basis, though, as you need to use Zorn's lemma or equivalent to construct it.
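Written out in symbols (my notation), the property being claimed of such a (Hamel) basis B is:

```latex
% B is a Hamel basis for the space of all functions f : \mathbb{R} \to \mathbb{R}:
% every f is a finite linear combination of elements of B,
% in exactly one way (up to reordering the terms):
f = \sum_{i=1}^{n} c_i\, b_i,
\qquad n \in \mathbb{N},\quad c_i \in \mathbb{R}\setminus\{0\},\quad b_i \in B.
```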

seanhunter|7 months ago

If you restrict yourself to Lebesgue-integrable functions, can’t you take the complex Fourier transform of the function and call the terms of the Fourier series a basis, with the coefficients being the components of the vector of the function? This is a bit above my current mathematical paygrade, so forgive me if I’m not expressing the idea accurately, but I’m learning a lot both from the article and the ensuing discussion - hopefully you understand what I’m getting at.

I think what I may be asking is “Does the complex Fourier transform make a Hilbert space?” but I might be wrong both about that and about that being the right question.

MITSardine|7 months ago

Sorry, I couldn't find the page in English, but what you're talking about is a Hilbert basis: https://fr.wikipedia.org/wiki/Base_de_Hilbert . There is a paragraph on this in the orthonormal basis English page: https://en.wikipedia.org/wiki/Orthonormal_basis

Another example is the eigenvectors of linear operators like the Laplacian. Recall how, in finite dimension, the eigenvectors of a symmetric (self-adjoint) operator (matrix) form an orthonormal basis of the vector space. There is a similar notion in infinite dimension. I can't find an English page that covers this very well, but there's a couple of paragraphs in the Spectral Theorem page (https://en.wikipedia.org/wiki/Spectral_theorem#Unbounded_sel... ). The article linked here also touches on this.

Regarding your last sentence, one thing to note is that having a basis is not what makes you a Hilbert space, but rather having an inner product! In fact, to get the Fourier coefficients, you need to use that inner product.
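A quick numerical sanity check of that inner-product structure (a sketch; the midpoint-rule integrator and the names `inner`/`sin_k` are my own choices): the functions sin(kx) are pairwise orthogonal in L^2([-pi, pi]), each with squared norm pi.

```python
import math

def inner(f, g, n=200_000):
    # L^2 inner product <f, g> = integral_{-pi}^{pi} f(x) g(x) dx,
    # approximated with the midpoint rule on n subintervals
    h = 2 * math.pi / n
    return h * sum(f(-math.pi + (i + 0.5) * h) * g(-math.pi + (i + 0.5) * h)
                   for i in range(n))

def sin_k(k):
    return lambda x: math.sin(k * x)

# <sin(x), sin(2x)> is ~0 (orthogonal); <sin(3x), sin(3x)> is ~pi
```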

998244353|7 months ago

No, and this is where this formal notion of basis I mentioned unfortunately diverges from what is perhaps more useful in practice.

You can represent any function f: [-pi, pi] -> R as an infinite sum

    f(x) = sum_(k = 0 to infinity) (a_k sin(kx) + b_k cos(kx))
for some coefficients a_k and b_k as long as f is sufficiently nice (I don't remember the exact condition, sorry).

This is very useful, but the functions sin(x), sin(2x), ... , cos(x), cos(2x), ... don't constitute a basis in the formal sense I mentioned above as you need an infinite sum to represent most functions. It is still often called a basis though.
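To make the coefficient-extraction step concrete, here's a hedged sketch (Python; the integrator and the test function f(x) = x are my choices, not from the thread). For f(x) = x on [-pi, pi] the sine coefficients are 2(-1)^(k+1)/k, so a_1 should come out near 2 and a_2 near -1.

```python
import math

def sin_coeff(f, k, n=100_000):
    # a_k = (1/pi) * integral_{-pi}^{pi} f(x) sin(kx) dx  (midpoint rule)
    h = 2 * math.pi / n
    total = sum(f(-math.pi + (i + 0.5) * h) * math.sin(k * (-math.pi + (i + 0.5) * h))
                for i in range(n))
    return h * total / math.pi

a1 = sin_coeff(lambda x: x, 1)  # ~ 2.0
a2 = sin_coeff(lambda x: x, 2)  # ~ -1.0
```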

eapriv|7 months ago

You can’t, because a Fourier series is an infinite sum, not a (finite) linear combination.

ttoinou|7 months ago

I'd love to read more about that; he doesn't talk about it at all in this article, though.

tel|7 months ago

If you're familiar with Zorn's Lemma, the construction is straightforward: order the linearly independent subsets by inclusion. Every chain has an upper bound, namely the union of its members (a finite linear dependence would already appear in some member of the chain, so the union stays independent). By Zorn's Lemma there is then a maximal linearly independent set, and if some element lay outside its span, you could adjoin it and contradict that maximality.
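Schematically (my notation), with P the poset of linearly independent subsets of the space V, ordered by inclusion:

```latex
% Each chain C \subseteq P has the upper bound \bigcup C \in P
% (any finite dependence would already occur in some member of C).
% Zorn's Lemma then yields a maximal element B \in P, and
v \notin \operatorname{span}(B) \implies B \cup \{v\} \in P,
% which contradicts the maximality of B; hence \operatorname{span}(B) = V.
```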

lmm|7 months ago

He doesn't need to talk about it (though you might like to look up the notorious Hilbert's Basis Theorem); it happens to be the case that any vector space has a basis, but even if you don't know that, a vector space is still a vector space and its elements are still vectors.