Transformer Linearity, Face-Adapter Diffusion Models, Cross-Layer Attention Shrinks LLMs, Image Generation Breakthrough
Content provided by PocketPod. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by PocketPod or its podcast platform partners. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined at https://zh.player.fm/legal.
Papers discussed in this episode:
- Your Transformer is Secretly Linear
- Diffusion for World Modeling: Visual Details Matter in Atari
- Face Adapter for Pre-Trained Diffusion Models with Fine-Grained ID and Attribute Control
- Reducing Transformer Key-Value Cache Size with Cross-Layer Attention
- OmniGlue: Generalizable Feature Matching with Foundation Model Guidance
- Personalized Residuals for Concept-Driven Text-to-Image Generation
70 episodes