759: Full Encoder-Decoder Transformers Fully Explained, with Kirill Eremenko
Manage episode 429699126 series 2532807
Encoders, cross-attention, and masking for LLMs: SuperDataScience Founder Kirill Eremenko returns to the SuperDataScience podcast, where he speaks with Jon Krohn about transformer architectures and why they are a new frontier for generative AI. If you're interested in applying LLMs to your business portfolio, you'll want to pay close attention to this episode!
This episode is brought to you by Ready Tensor, where innovation meets reproducibility, by Oracle NetSuite business software, and by Intel and HPE Ezmeral Software Solutions. Interested in sponsoring a SuperDataScience Podcast episode? Visit passionfroot.me/superdatascience for sponsorship information.
In this episode you will learn:
• How decoder-only transformers work [15:51]
• How cross-attention works in transformers [41:05]
• How encoders and decoders work together (an example) [52:46]
• How encoder-only architectures excel at understanding natural language [1:20:34]
• The importance of masking during self-attention [1:27:08]
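To make the masking topic above concrete: in decoder-style self-attention, each token is prevented from attending to tokens that come after it. Below is a minimal, hypothetical NumPy sketch of single-head causal self-attention; it is an illustration of the general technique, not code from the episode, and the function and variable names are our own.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def masked_self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention with a causal mask:
    each position may attend only to itself and earlier positions."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (seq, seq) attention logits
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -np.inf                     # block attention to future tokens
    return softmax(scores) @ V

# Tiny demo: 4 tokens, embedding dim 8, head dim 4
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = masked_self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 4)
```

Note that because of the mask, the first token's output depends only on its own value vector, which is an easy sanity check when experimenting with implementations like this.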
Additional materials: www.superdatascience.com/759
977 episodes