<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Tensor Networks | Mahyar's world 🌏</title><link>https://mahyar-osanlouy.com/tag/tensor-networks/</link><atom:link href="https://mahyar-osanlouy.com/tag/tensor-networks/index.xml" rel="self" type="application/rss+xml"/><description>Tensor Networks</description><generator>Wowchemy (https://wowchemy.com)</generator><language>en-us</language><lastBuildDate>Fri, 04 Aug 2023 00:00:00 +0000</lastBuildDate><image><url>https://mahyar-osanlouy.com/media/icon_hu35e4e9c9135f02752aab27d124db531b_75212_512x512_fill_lanczos_center_3.png</url><title>Tensor Networks</title><link>https://mahyar-osanlouy.com/tag/tensor-networks/</link></image><item><title>Why Normalizing Flows (and Tensorizing Flows) deserve more attention</title><link>https://mahyar-osanlouy.com/post/tensorizing-flows/</link><pubDate>Fri, 04 Aug 2023 00:00:00 +0000</pubDate><guid>https://mahyar-osanlouy.com/post/tensorizing-flows/</guid><description>&lt;p>Other generative models like diffusion models and autoregressive LLMs tend to steal the spotlight, since they&amp;rsquo;re great
at producing stunning images or fluent text. Normalizing Flows, on the other hand, aren&amp;rsquo;t the first choice for
those headline-grabbing tasks. But if you focus only on sample quality, you might overlook what makes Normalizing Flows
truly valuable.&lt;/p>
&lt;h2 id="why-normalizing-flows-deserve-more-attention">Why Normalizing Flows Deserve More Attention&lt;/h2>
&lt;p>Most generative models are black boxes. GANs, for example, can create high-quality samples, but you can&amp;rsquo;t compute the
likelihood of a given data point. Energy-based models often only give you unnormalized densities, so you can compare
samples but not get an actual probability.&lt;/p>
&lt;p>Normalizing Flows are different. They let you map a simple base distribution (like a Gaussian) through a sequence of
invertible transformations to model complex data. The kicker? You always have access to the exact, normalized probability
density for any sample. This is a huge deal for applications where you need to know the likelihood, not just generate
data.&lt;/p>
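&lt;p>As a rough sketch (mine, not any particular library&amp;rsquo;s API), the change-of-variables bookkeeping for a single affine layer looks like this; &lt;code>LOG_S&lt;/code> and &lt;code>T&lt;/code> are made-up stand-ins for learned parameters:&lt;/p>

```python
import math, random

# Rough sketch (not any library's API): one affine flow layer
# x = exp(LOG_S) * z + T applied to a standard-normal base.
# LOG_S and T are made-up stand-ins for learned parameters.

LOG_S, T = 0.5, 2.0

def base_logpdf(z):
    # log density of the standard-normal base distribution
    return -0.5 * (z * z + math.log(2.0 * math.pi))

def forward(z):
    # invertible map from a base sample z to a data sample x
    return math.exp(LOG_S) * z + T

def inverse(x):
    return (x - T) * math.exp(-LOG_S)

def flow_logpdf(x):
    # exact, normalized density via the change-of-variables formula:
    # log p_X(x) = log p_Z(inverse(x)) - log|d forward / d z|
    return base_logpdf(inverse(x)) - LOG_S

# sampling is just pushing a base sample through the flow
sample = forward(random.gauss(0.0, 1.0))
print(sample, flow_logpdf(sample))
```

&lt;p>With many stacked invertible layers the log-Jacobian terms simply add up, which is why the exact density stays cheap to evaluate.&lt;/p>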
&lt;h2 id="the-real-world-use-case-variational-inference">The Real-World Use Case: Variational Inference&lt;/h2>
&lt;p>One area where this property is crucial is Variational Inference (VI). Here, you want to approximate a complex target
distribution with a flexible, normalized family so you can do things like Bayesian inference efficiently.
NFs are a natural fit because you can both sample from them and compute exact densities—something most other models
can&amp;rsquo;t offer.&lt;/p>
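&lt;p>Concretely, VI with a flow usually minimizes the reverse KL divergence, which needs only samples from the flow, its exact log density, and the target&amp;rsquo;s unnormalized density. A minimal Monte Carlo sketch, with a fixed standard-normal &amp;ldquo;flow&amp;rdquo; standing in for a trained one:&lt;/p>

```python
import math, random

# Hedged sketch of the reverse-KL objective a flow is trained on in VI.
# For brevity the "flow" is the identity map on a standard-normal base,
# so q is just N(0, 1); a real NF would have learnable invertible layers.

def log_q(x):
    # exact, normalized log density of the variational distribution q
    return -0.5 * (x * x + math.log(2.0 * math.pi))

def log_p_unnorm(x):
    # unnormalized log target density; VI only needs p up to a constant
    return -0.5 * (x - 1.0) ** 2

def reverse_kl_estimate(n=50_000, seed=0):
    # Monte Carlo estimate of E_q[log q(x) - log p_unnorm(x)], which
    # equals KL(q || p) up to the unknown log normalizer of p
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)  # sampling from q = pushing noise through the flow
        total += log_q(x) - log_p_unnorm(x)
    return total / n

print(reverse_kl_estimate())
```

&lt;p>Minimizing this estimate over the flow&amp;rsquo;s parameters is the whole training loop; models that lack either exact sampling or exact densities can&amp;rsquo;t form it directly.&lt;/p>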
&lt;h2 id="but-theres-a-catch">But There&amp;rsquo;s a Catch&amp;hellip;&lt;/h2>
&lt;p>Traditional NFs use a Gaussian as their base distribution. This works fine for unimodal targets, but if your true
distribution is multimodal (think: multiple peaks), NFs tend to &amp;ldquo;collapse&amp;rdquo; to just one mode. The reason: the flow is a
smooth, invertible deformation of a single Gaussian bump, so spreading mass over well-separated peaks forces it through
thin low-density bridges, and optimization usually settles on covering one peak instead. This limits their
expressiveness in VI, especially for challenging scientific or physics problems where multimodality is the norm.&lt;/p>
&lt;h2 id="enter-tensorizing-flows">Enter Tensorizing Flows&lt;/h2>
&lt;p>The paper &amp;ldquo;Tensorizing Flows: A Tool for Variational Inference&amp;rdquo; introduces a clever fix: replace the Gaussian base
with a tensor-train (TT) distribution, built using tools from tensor networks. This TT base can already capture much
of the structure (including multimodality) of the target distribution, so the flow only needs to handle the
&amp;ldquo;fine details.&amp;rdquo; The result is a model that&amp;rsquo;s both more expressive and easier to train for high-dimensional,
multimodal problems.&lt;/p>
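&lt;p>To give a flavor of that base distribution, here is a toy discrete tensor-train density in plain Python. The cores are invented numbers, not the construction from the paper, but they show how a chain of small low-rank tensors can encode several modes at once:&lt;/p>

```python
# Hedged sketch of a tensor-train (TT) parameterization of an unnormalized
# density over discrete states; the cores below are made-up numbers, not
# the construction used in the Tensorizing Flows paper.

def matvec_chain(cores, state):
    # contract the TT cores selected by `state` left to right:
    # row vector x matrix x ... x column vector -> scalar
    vec = cores[0][state[0]]           # first core: a row vector per symbol
    for core, s in zip(cores[1:-1], state[1:-1]):
        mat = core[s]                  # middle core: a matrix per symbol
        vec = [sum(v * m for v, m in zip(vec, col)) for col in zip(*mat)]
    last = cores[-1][state[-1]]        # last core: a column vector per symbol
    return sum(v * c for v, c in zip(vec, last))

# three binary variables, TT rank 2
cores = [
    [[1.0, 0.2], [0.2, 1.0]],                              # shape (2,) per symbol
    [[[1.0, 0.0], [0.0, 0.3]], [[0.3, 0.0], [0.0, 1.0]]],  # shape (2, 2) per symbol
    [[1.0, 0.2], [0.2, 1.0]],
]

def tt_density(state):
    # squaring keeps the unnormalized density nonnegative (Born-rule style)
    return matvec_chain(cores, state) ** 2

# a multimodal pattern: (0, 0, 0) and (1, 1, 1) both get high mass,
# mixed states like (0, 1, 0) get little
print(tt_density((0, 0, 0)), tt_density((1, 1, 1)), tt_density((0, 1, 0)))
```

&lt;p>The storage cost grows linearly in the number of variables rather than exponentially, which is what makes a TT base tractable in high dimensions while still placing mass on several separated modes.&lt;/p>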
&lt;h2 id="resources">Resources&lt;/h2>
&lt;ul>
&lt;li>&lt;a href="https://arxiv.org/pdf/2305.02460" target="_blank" rel="noopener">Article&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://github.com/VincentStimper/normalizing-flows" target="_blank" rel="noopener">NormFlow&lt;/a>&lt;/li>
&lt;/ul></description></item></channel></rss>