[46] Yulia Tsvetkov - Linguistic Knowledge in Data-Driven NLP
Content provided by The Thesis Review and Sean Welleck. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by The Thesis Review and Sean Welleck or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://fa.player.fm/legal
Yulia Tsvetkov is a Professor in the Allen School of Computer Science & Engineering at the University of Washington. Her research focuses on multilingual NLP, NLP for social good, and language generation. Yulia's PhD thesis is titled "Linguistic Knowledge in Data-Driven Natural Language Processing", which she completed in 2016 at CMU. We discuss getting started in research, then move to Yulia's work in the thesis that combines ideas from linguistics and natural language processing. We discuss low-resource and multilingual NLP, large language models, and great advice about research and beyond. - Episode notes: www.wellecks.com/thesisreview/episode46.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
49 episodes
All episodes
[48] Tianqi Chen - Scalable and Intelligent Learning Systems
Tianqi Chen is an Assistant Professor in the Machine Learning Department and Computer Science Department at Carnegie Mellon University and the Chief Technologist of OctoML. His research focuses on the intersection of machine learning and systems. Tianqi's PhD thesis is titled "Scalable and Intelligent Learning Systems," which he completed in 2019 at the University of Washington. We discuss his influential work on machine learning systems, starting with the development of XGBoost, an optimized distributed gradient boosting library that has had an enormous impact in the field. We also cover his contributions to deep learning frameworks like MXNet and machine learning compilation with TVM, and connect these to modern generative AI. - Episode notes: www.wellecks.com/thesisreview/episode48.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Follow Tianqi Chen on Twitter (@tqchenml) - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…
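For readers unfamiliar with the library discussed in this episode, here is a minimal sketch of a gradient-boosting workflow with the xgboost Python package; the toy data, parameter values, and variable names are illustrative assumptions, not material from the episode or thesis.

```python
# Minimal, illustrative sketch of training a gradient-boosted model with
# the xgboost package (toy data and parameters are assumptions, not from
# the episode or thesis).
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                       # toy features
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=500)  # toy regression target

dtrain = xgb.DMatrix(X[:400], label=y[:400])         # training split
dvalid = xgb.DMatrix(X[400:], label=y[400:])         # validation split

params = {"objective": "reg:squarederror", "max_depth": 4, "eta": 0.1}
booster = xgb.train(params, dtrain, num_boost_round=100,
                    evals=[(dvalid, "valid")], verbose_eval=False)
preds = booster.predict(dvalid)                      # predictions on held-out data
```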

[47] Niloofar Mireshghallah - Auditing and Mitigating Safety Risks in Large Language Models 1:17:06
Niloofar Mireshghallah is a postdoctoral scholar at the University of Washington. Her research focuses on privacy, natural language processing, and the societal implications of machine learning. Niloofar completed her PhD in 2023 at UC San Diego, where she was advised by Taylor Berg-Kirkpatrick. Her PhD thesis is titled "Auditing and Mitigating Safety Risks in Large Language Models." We discuss her journey into research and her work on privacy and LLMs, including how privacy is defined, common attacks and mitigations, differential privacy, and the balance between memorization and generalization. - Episode notes: www.wellecks.com/thesisreview/episode47.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[46] Yulia Tsvetkov - Linguistic Knowledge in Data-Driven NLP
Yulia Tsvetkov is a Professor in the Allen School of Computer Science & Engineering at the University of Washington. Her research focuses on multilingual NLP, NLP for social good, and language generation. Yulia's PhD thesis is titled "Linguistic Knowledge in Data-Driven Natural Language Processing", which she completed in 2016 at CMU. We discuss getting started in research, then move to Yulia's work in the thesis that combines ideas from linguistics and natural language processing. We discuss low-resource and multilingual NLP, large language models, and great advice about research and beyond. - Episode notes: www.wellecks.com/thesisreview/episode46.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[45] Luke Zettlemoyer - Learning to Map Sentences to Logical Form
Luke Zettlemoyer is a Professor at the University of Washington and Research Scientist at Meta. His work spans machine learning and NLP, including foundational work in large-scale self-supervised pretraining of language models. Luke's PhD thesis is titled "Learning to Map Sentences to Logical Form", which he completed in 2009 at MIT. We talk about his PhD work, the path to the foundational ELMo paper, and various topics related to large language models. - Episode notes: www.wellecks.com/thesisreview/episode45.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[44] Hady Elsahar - NLG from Structured Knowledge Bases (& Controlling LMs) 1:05:56
Hady Elsahar is a Research Scientist at Naver Labs Europe. His research focuses on Neural Language Generation under constrained and controlled conditions. Hady's PhD was on interactions between Natural Language and Structured Knowledge bases for Data2Text Generation and Relation Extraction & Discovery, which he completed in 2019 at the Université de Lyon. We talk about his PhD work and how it led to interests in multilingual and low-resource NLP, as well as controlled generation. We dive deeper into controlling language models, including his interesting work on distributional control and energy-based models. - Episode notes: www.wellecks.com/thesisreview/episode44.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[43] Swarat Chaudhuri - Logics and Algorithms for Software Model Checking 1:06:18
Swarat Chaudhuri is an Associate Professor at the University of Texas. His lab studies problems at the interface of programming languages, logic and formal methods, and machine learning. Swarat's PhD thesis is titled "Logics and Algorithms for Software Model Checking", which he completed in 2007 at the University of Pennsylvania. We discuss reasoning about programs, formal methods & safer machine learning systems, and the future of program synthesis & neurosymbolic programming. - Episode notes: www.wellecks.com/thesisreview/episode43.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[42] Charles Sutton - Efficient Training Methods for Conditional Random Fields 1:18:01
Charles Sutton is a Research Scientist at Google Brain and an Associate Professor at the University of Edinburgh. His research focuses on deep learning for generating code and helping people write better programs. Charles' PhD thesis is titled "Efficient Training Methods for Conditional Random Fields", which he completed in 2008 at UMass Amherst. We start with his work in the thesis on structured models for text, and compare/contrast with today's large language models. From there, we discuss machine learning for code & the future of language models in program synthesis. - Episode notes: https://cs.nyu.edu/~welleck/episode42.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[41] Talia Ringer - Proof Repair 1:19:02
Talia Ringer is an Assistant Professor with the Programming Languages, Formal Methods, and Software Engineering group at the University of Illinois Urbana-Champaign. Her research focuses on formal verification and proof engineering technologies. Talia's PhD thesis is titled "Proof Repair", which she completed in 2021 at the University of Washington. We discuss software verification and her PhD work on proof repair for maintaining verified systems, and explore the intersection of machine learning with her work. - Episode notes: https://cs.nyu.edu/~welleck/episode41.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[40] Lisa Lee - Learning Embodied Agents with Scalably-Supervised Reinforcement Learning
Lisa Lee is a Research Scientist at Google Brain. Her research focuses on building AI agents that can learn and adapt like humans and animals do. Lisa's PhD thesis is titled "Learning Embodied Agents with Scalably-Supervised Reinforcement Learning", which she completed in 2021 at Carnegie Mellon University. We talk about her work in the thesis on reinforcement learning, including exploration, learning with weak supervision, and embodied agents, and cover various topics related to trends in reinforcement learning. - Episode notes: https://cs.nyu.edu/~welleck/episode40.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[39] Burr Settles - Curious Machines: Active Learning with Structured Instances 1:06:33
Burr Settles leads the research group at Duolingo, a language-learning website and mobile app whose mission is to make language education free and accessible to everyone. Burr’s PhD thesis is titled "Curious Machines: Active Learning with Structured Instances", which he completed in 2008 at the University of Wisconsin-Madison. We talk about his work in the thesis on active learning, then chart the path to Burr’s role at Duolingo. We discuss machine learning for education and language learning, including content, assessment, and the exciting possibilities opened by recent advancements. - Episode notes: https://cs.nyu.edu/~welleck/episode39.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[38] Andrew Lampinen - A Computational Framework for Learning and Transforming Task Representations 1:04:47
Andrew Lampinen is a research scientist at DeepMind. His research focuses on cognitive flexibility and generalization. Andrew’s PhD thesis is titled "A Computational Framework for Learning and Transforming Task Representations", which he completed in 2020 at Stanford University. We talk about cognitive flexibility in brains and machines, centered around his work in the thesis on meta-mapping. We cover a lot of interesting ground, including complementary learning systems and memory, compositionality and systematicity, and the role of symbols in machine learning. - Episode notes: https://cs.nyu.edu/~welleck/episode38.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[37] Joonkoo Park - Neural Substrates of Visual Word and Number Processing 1:09:28
Joonkoo Park is an Associate Professor and Honors Faculty in the Department of Psychological and Brain Sciences at UMass Amherst. He leads the Cognitive and Developmental Neuroscience Lab, focusing on understanding the developmental mechanisms and neurocognitive underpinnings of our knowledge about number and mathematics. Joonkoo’s PhD thesis is titled "Experiential Effects on the Neural Substrates of Visual Word and Number Processing", which he completed in 2011 at the University of Michigan. We talk about numerical processing in the brain, starting with nature vs. nurture, including the learned versus built-in aspects of neural architectures. We talk about the difference between word and number processing, types of numerical thinking, and symbolic vs. non-symbolic numerical processing. - Episode notes: https://cs.nyu.edu/~welleck/episode37.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[36] Dieuwke Hupkes - Hierarchy and Interpretability in Neural Models of Language Processing 1:02:26
Dieuwke Hupkes is a Research Scientist at Facebook AI Research and the scientific manager of the Amsterdam unit of ELLIS. Dieuwke's PhD thesis is titled, "Hierarchy and Interpretability in Neural Models of Language Processing", which she completed in 2020 at the University of Amsterdam. We discuss her work on which aspects of hierarchical compositionality and syntactic structure can be learned by recurrent neural networks, how these models can serve as explanatory models of human language processing, what compositionality actually means, and a lot more. - Episode notes: https://cs.nyu.edu/~welleck/episode36.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[35] Armando Solar-Lezama - Program Synthesis by Sketching 1:15:56
Armando Solar-Lezama is a Professor at MIT, and the Associate Director & COO of CSAIL. He leads the Computer Assisted Programming Group, focused on program synthesis. Armando’s PhD thesis is titled, "Program Synthesis by Sketching", which he completed in 2008 at UC Berkeley. We talk about program synthesis & his work on Sketch, how machine learning's role in program synthesis has evolved over time, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode35.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[34] Sasha Rush - Lagrangian Relaxation for Natural Language Decoding 1:08:12
Sasha Rush is an Associate Professor at Cornell Tech and researcher at Hugging Face. His research focuses on building NLP systems that are safe, fast, and controllable. Sasha's PhD thesis is titled, "Lagrangian Relaxation for Natural Language Decoding", which he completed in 2014 at MIT. We talk about his work in the thesis on decoding in NLP, how it connects with today, and many interesting topics along the way such as the role of engineering in machine learning, breadth vs. depth, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode34.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[33] Michael R. Douglas - G/H Conformal Field Theory 1:12:58
Michael R. Douglas is a theoretical physicist and Professor at Stony Brook University, and Visiting Scholar at Harvard University. His research focuses on string theory, theoretical physics and its relations to mathematics. Michael's PhD thesis is titled, "G/H Conformal Field Theory", which he completed in 1988 at Caltech. We talk about working with Feynman, Sussman, and Hopfield during his PhD days, the superstring revolutions and string theory, and machine learning's role in the future of science and mathematics. - Episode notes: https://cs.nyu.edu/~welleck/episode33.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[32] Andre Martins - The Geometry of Constrained Structured Prediction 1:26:39
Andre Martins is an Associate Professor at IST and VP of AI Research at Unbabel in Lisbon, Portugal. His research focuses on natural language processing and machine learning. Andre’s PhD thesis is titled, "The Geometry of Constrained Structured Prediction: Applications to Inference and Learning of Natural Language Syntax", which he completed in 2012 at Carnegie Mellon University and IST. We talk about his work in the thesis on structured prediction in NLP, and discuss connections between his thesis work and later work on sparsity, sparse communication, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode32.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[31] Jay McClelland - Preliminary Letter Identification in the Perception of Words and Nonwords 1:33:51
Jay McClelland is a Professor in the Psychology Department and Director of the Center for Mind, Brain, Computation and Technology at Stanford. His research addresses a broad range of topics in cognitive science and cognitive neuroscience, including Parallel Distributed Processing (PDP). Jay's PhD thesis is titled "Preliminary Letter Identification in the Perception of Words and Nonwords", which he completed in 1975 at the University of Pennsylvania. We discuss his work in the thesis on the word superiority effect, how it led to the Interactive Activation model, the path to Parallel Distributed Processing and the connectionist revolution, and distributed vs rule-based and symbolic approaches to modeling human cognition and artificial intelligence. - Episode notes: https://cs.nyu.edu/~welleck/episode31.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[30] Dustin Tran - Probabilistic Programming for Deep Learning 1:02:50
Dustin Tran is a research scientist at Google Brain. His research focuses on advancing science and intelligence, including areas involving probability, programs, and neural networks. Dustin’s PhD thesis is titled "Probabilistic Programming for Deep Learning", which he completed in 2020 at Columbia University. We discuss the intersection of probabilistic modeling and deep learning, including the Edward library and the novel inference algorithms and models that he developed in the thesis. - Episode notes: https://cs.nyu.edu/~welleck/episode30.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[29] Tengyu Ma - Non-convex Optimization for Machine Learning 1:17:22
Tengyu Ma is an Assistant Professor at Stanford University. His research focuses on deep learning and its theory, as well as various topics in machine learning. Tengyu's PhD thesis is titled "Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding", which he completed in 2017 at Princeton University. We discuss theory in machine learning and deep learning, including the 'all local minima are global minima' property, overparameterization, as well as perspectives that theory takes on understanding deep learning. - Episode notes: https://cs.nyu.edu/~welleck/episode29.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[28] Karen Ullrich - A Coding Perspective on Deep Latent Variable Models 1:06:20
Karen Ullrich is a Research Scientist at FAIR. Her research focuses on the intersection of information theory and probabilistic machine learning and deep learning. Karen's PhD thesis is titled "A coding perspective on deep latent variable models", which she completed in 2020 at The University of Amsterdam. We discuss information theory & the minimum description length principle, along with her work in the thesis on compression and communication. Episode notes: https://cs.nyu.edu/~welleck/episode28.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[27] Danqi Chen - Neural Reading Comprehension and Beyond
Danqi Chen is an assistant professor at Princeton University, co-leading the Princeton NLP Group. Her research focuses on fundamental methods for learning representations of language and knowledge, and practical systems including question answering, information extraction and conversational agents. Danqi’s PhD thesis is titled "Neural Reading Comprehension and Beyond", which she completed in 2018 at Stanford University. We discuss her work on parsing, reading comprehension and question answering. Throughout we discuss progress in NLP, fundamental challenges, and what the future holds. Episode notes: https://cs.nyu.edu/~welleck/episode27.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[26] Kevin Ellis - Algorithms for Learning to Induce Programs 1:17:49
Kevin Ellis is an assistant professor at Cornell and currently a research scientist at Common Sense Machines. His research focuses on artificial intelligence, program synthesis, and neurosymbolic models. Kevin's PhD thesis is titled "Algorithms for Learning to Induce Programs", which he completed in 2020 at MIT. We discuss Kevin’s work at the intersection of machine learning and program induction, including inferring graphics programs from images and drawings, DreamCoder, and more. Episode notes: https://cs.nyu.edu/~welleck/episode26.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[25] Tomas Mikolov - Statistical Language Models Based on Neural Networks 1:19:17
Tomas Mikolov is a Senior Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research has covered topics in natural language understanding and representation learning, including Word2Vec, and complexity. Tomas's PhD thesis is titled "Statistical Language Models Based on Neural Networks", which he completed in 2012 at the Brno University of Technology. We discuss compression and recurrent language models, the backstory behind Word2Vec, and his recent work on complexity & automata. Episode notes: https://cs.nyu.edu/~welleck/episode25.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…
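As a companion to the Word2Vec discussion, here is a minimal sketch of training skip-gram embeddings with the gensim library; the toy corpus and hyperparameters are placeholders and not drawn from the episode or thesis.

```python
# Minimal sketch of skip-gram Word2Vec training with gensim
# (toy corpus and hyperparameters are illustrative assumptions).
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "cat", "sat", "on", "the", "mat"],
]

model = Word2Vec(sentences, vector_size=50, window=2, min_count=1,
                 sg=1, epochs=50, seed=0)           # sg=1 selects skip-gram
vector = model.wv["king"]                           # 50-dimensional word embedding
neighbors = model.wv.most_similar("king", topn=3)   # nearest words by cosine similarity
```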

[24] Martin Arjovsky - Out of Distribution Generalization in Machine Learning 1:02:48
Martin Arjovsky is a postdoctoral researcher at INRIA. His research focuses on generative modeling, generalization, and exploration in RL. Martin's PhD thesis is titled "Out of Distribution Generalization in Machine Learning", which he completed in 2019 at New York University. We discuss his work on the influential Wasserstein GAN early in his PhD, then discuss his thesis work on out-of-distribution generalization which focused on causal invariance and invariant risk minimization. Episode notes: https://cs.nyu.edu/~welleck/episode24.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[23] Simon Du - Gradient Descent for Non-convex Problems in Modern Machine Learning 1:06:30
Simon Shaolei Du is an Assistant Professor at the University of Washington. His research focuses on theoretical foundations of deep learning, representation learning, and reinforcement learning. Simon's PhD thesis is titled "Gradient Descent for Non-convex Problems in Modern Machine Learning", which he completed in 2019 at Carnegie Mellon University. We discuss his work related to the theory of gradient descent for challenging non-convex problems that we encounter in deep learning. We cover various topics including connections with the Neural Tangent Kernel, theory vs. practice, and future research directions. Episode notes: https://cs.nyu.edu/~welleck/episode23.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[22] Graham Neubig - Unsupervised Learning of Lexical Information 1:02:30
Graham Neubig is an Associate Professor at Carnegie Mellon University. His research focuses on language and its role in human communication, with the goal of breaking down barriers in human-human or human-machine communication through the development of NLP technologies. Graham’s PhD thesis is titled "Unsupervised Learning of Lexical Information for Language Processing Systems", which he completed in 2012 at Kyoto University. We discuss his PhD work related to the fundamental processing units that NLP systems use to process text, including non-parametric Bayesian models, segmentation, and alignment problems, and discuss how his perspective on machine translation has evolved over time. Episode notes: http://cs.nyu.edu/~welleck/episode22.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at http://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[21] Michela Paganini - Machine Learning Solutions for High Energy Physics 1:07:43
Michela Paganini is a Research Scientist at DeepMind. Her research focuses on investigating ways to compress and scale up neural networks. Michela's PhD thesis is titled "Machine Learning Solutions for High Energy Physics", which she completed in 2019 at Yale University. We discuss her PhD work on deep learning for high energy physics, including jet tagging and fast simulation for the ATLAS experiment at the Large Hadron Collider, and the intersection of machine learning and physics. Episode notes: https://cs.nyu.edu/~welleck/episode21.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[20] Josef Urban - Deductive and Inductive Reasoning in Large Libraries of Formalized Mathematics 1:25:18
Josef Urban is a Principal Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research focuses on artificial intelligence for large-scale computer-assisted reasoning. Josef's PhD thesis is titled "Exploring and Combining Deductive and Inductive Reasoning in Large Libraries of Formalized Mathematics", which he completed in 2004 at Charles University in Prague. We discuss his PhD work on the Mizar Problems for Theorem Proving, machine learning for premise selection, and how it evolved into his recent research. Episode notes: https://cs.nyu.edu/~welleck/episode20.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[19] Dumitru Erhan - Understanding Deep Architectures and the Effect of Unsupervised Pretraining 1:20:03
Dumitru Erhan is a Research Scientist at Google Brain. His research focuses on understanding the world with neural networks. Dumitru's PhD thesis is titled "Understanding Deep Architectures and the Effect of Unsupervised Pretraining", which he completed in 2010 at the University of Montreal. We discuss his work in the thesis on understanding deep networks and unsupervised pretraining, his perspective on deep learning's development, and the path of ideas to his recent research. Episode notes: https://cs.nyu.edu/~welleck/episode19.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[18] Eero Simoncelli - Distributed Representation and Analysis of Visual Motion 1:25:37
Eero Simoncelli is a Professor of Neural Science, Mathematics, Data Science, and Psychology at New York University. His research focuses on representation and analysis of visual information. Eero's PhD thesis is titled "Distributed Representation & Analysis of Visual Motion", which he completed in 1993 at MIT. We discuss his PhD work which focused on optical flow, which ideas and methods have stayed with him throughout his career, making biological connections with machine learning models, and how Eero's perspective of vision has evolved. Episode notes: https://cs.nyu.edu/~welleck/episode18.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[17] Paul Middlebrooks - Neuronal Correlates of Meta-Cognition in Primate Frontal Cortex 1:36:10
Paul Middlebrooks is a neuroscientist and host of the Brain Inspired podcast, which explores the intersection of neuroscience and artificial intelligence. Paul's PhD thesis is titled "Neuronal Correlates of Meta-Cognition in Primate Frontal Cortex", which he completed at the University of Pittsburgh in 2011. We discuss Paul's work on meta-cognition - informally, thinking about thinking - then discuss neuroscience for A.I. and A.I. for neuroscience. Episode notes: https://cs.nyu.edu/~welleck/episode17.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at https://www.patreon.com/thesisreview…

[16] Aaron Courville - A Latent Cause Theory of Classical Conditioning 1:19:21
Aaron Courville is a Professor at the University of Montreal. His research focuses on the development of deep learning models and methods. Aaron's PhD thesis is titled "A Latent Cause Theory of Classical Conditioning", which he completed at Carnegie Mellon University in 2006. We discuss Aaron's work on the latent cause theory during his PhD, talk about how Aaron moved into machine learning and deep learning research, chart a path to today's deep learning methods, and discuss his recent work on systematic generalization in language. Episode notes: cs.nyu.edu/~welleck/episode16.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[15] Christian Szegedy - Some Applications of the Weighted Combinatorial Laplacian 1:06:52
Christian Szegedy is a Research Scientist at Google. His research includes machine learning methods such as the Inception architecture, batch normalization, and adversarial examples, and he currently investigates machine learning for mathematical reasoning. Christian’s PhD thesis is titled "Some Applications of the Weighted Combinatorial Laplacian" which he completed in 2005 at the University of Bonn. We discuss Christian’s background in mathematics, his PhD work on areas of both pure and applied mathematics, and his path into machine learning research. Finally, we discuss his recent work with using deep learning for mathematical reasoning and automatically formalizing mathematics. Episode notes: https://cs.nyu.edu/~welleck/episode15.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[14] Been Kim - Interactive and Interpretable Machine Learning Models 1:04:22
Been Kim is a Research Scientist at Google Brain. Her research focuses on designing high-performance machine learning methods that make sense to humans. Been's PhD thesis is titled "Interactive and Interpretable Machine Learning Models for Human Machine Collaboration", which she completed in 2015 at MIT. We discuss her work on interpretability, including her work in the thesis on the Bayesian Case Model and its interactive version, as well as connections with her subsequent work on black-box interpretability methods that are used in many real-world applications. Episode notes: https://cs.nyu.edu/~welleck/episode14.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[13] Adji Bousso Dieng - Deep Probabilistic Graphical Modeling 1:07:50
Adji Bousso Dieng is currently a Research Scientist at Google AI, and will be starting as an assistant professor at Princeton University in 2021. Her research focuses on combining probabilistic graphical modeling and deep learning to design models for structured high-dimensional data. Her PhD thesis is titled "Deep Probabilistic Graphical Modeling", which she completed in 2020 at Columbia University. We discuss her work on combining graphical models and deep learning, including models and algorithms, the value of interpretability and probabilistic models, as well as applications and making an impact through research. Episode notes: https://cs.nyu.edu/~welleck/episode13.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[12] Martha White - Regularized Factor Models 1:08:35
Martha White is an Associate Professor at the University of Alberta. Her research focuses on developing reinforcement learning and representation learning techniques for adaptive, autonomous agents learning on streams of data. Her PhD thesis is titled "Regularized Factor Models", which she completed in 2014 at the University of Alberta. We discuss the regularized factor model framework, which unifies many machine learning methods and led to new algorithms and applications. We talk about sparsity and how it also appears in her later work, as well as the common threads between her thesis work and her research in reinforcement learning. Episode notes: https://cs.nyu.edu/~welleck/episode12.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[11] Jacob Andreas - Learning from Language 1:19:43
Jacob Andreas is an Assistant Professor at MIT, where he leads the language and intelligence group, focusing on language as a communicative and computational tool. His PhD thesis is titled "Learning from Language", which he completed in 2018 at UC Berkeley. We discuss compositionality and neural module networks, the intersection of RL and language, and translating a neural communication channel called 'neuralese', which can lead to more interpretable machine learning models. Episode notes: https://cs.nyu.edu/~welleck/episode11.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[10] Chelsea Finn - Learning to Learn with Gradients
Chelsea Finn is an Assistant Professor at Stanford University, where she leads the IRIS lab that studies intelligence through robotic interaction at scale. Her PhD thesis is titled "Learning to Learn with Gradients", which she completed in 2018 at UC Berkeley. Chelsea received the prestigious ACM Doctoral Dissertation Award for her work in the thesis. We discuss machine learning for robotics, focusing on learning-to-learn - also known as meta-learning - and her work on the MAML algorithm during her PhD, as well as the future of robotics research. Episode notes: https://cs.nyu.edu/~welleck/episode10.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[09] Kenneth Stanley - Efficient Evolution of Neural Networks through Complexification 1:21:26
Kenneth Stanley is a researcher at OpenAI, where he leads the team on Open-endedness. Previously he was a Professor of Computer Science at the University of Central Florida, cofounder of Geometric Intelligence, and head of Core AI research at Uber AI Labs. His PhD thesis is titled "Efficient Evolution of Neural Networks through Complexification", which he completed in 2004 at the University of Texas. We talk about evolving increasingly complex structures and how this led to the NEAT algorithm that he developed during his PhD. We discuss his research directions related to open-endedness, how the field has changed over time, and how he currently views algorithms that were developed over a decade ago. Episode notes: https://cs.nyu.edu/~welleck/episode9.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[08] He He - Sequential Decisions and Predictions in NLP 1:00:39
He He is an Assistant Professor at New York University. Her research focuses on enabling reliable communication in natural language between machines and humans, including topics in text generation, robust language understanding, and dialogue systems. Her PhD thesis is titled "Sequential Decisions and Predictions in NLP", which she completed in 2016 at the University of Maryland. We talk about the intersection of language with imitation learning and reinforcement learning, her work in the thesis on opponent modeling and simultaneous translation, and how it relates to recent work on generation and robustness. Episode notes: https://cs.nyu.edu/~welleck/episode8.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[07] John Schulman - Optimizing Expectations: From Deep RL to Stochastic Computation Graphs 1:04:28
John Schulman is a Research Scientist and co-founder of OpenAI. John co-leads the reinforcement learning team, researching algorithms that safely and efficiently learn by trial and error and by imitating humans. His PhD thesis is titled "Optimizing Expectations: From Deep Reinforcement Learning to Stochastic Computation Graphs", which he completed in 2016 at Berkeley. We talk about his work on stochastic computation graphs and TRPO, how it evolved to PPO and how it's used in large-scale applications like OpenAI Five, as well as his recent work on generalization in RL. Episode notes: https://cs.nyu.edu/~welleck/episode7.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…
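For context on the PPO algorithm mentioned here, the snippet below sketches the clipped surrogate objective at its core; it is a simplified NumPy illustration, not code from the thesis or any official implementation, and the function and argument names are invented for the example.

```python
# Simplified illustration of PPO's clipped surrogate objective in NumPy
# (not code from the thesis or any official implementation).
import numpy as np

def ppo_clip_loss(logp_new, logp_old, advantages, clip_eps=0.2):
    """Return the negative clipped surrogate objective (to be minimized)."""
    ratio = np.exp(logp_new - logp_old)                        # pi_new(a|s) / pi_old(a|s)
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    return -np.mean(np.minimum(unclipped, clipped))            # pessimistic (clipped) bound
```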

[06] Yoon Kim - Deep Latent Variable Models of Natural Language 1:05:50
Yoon Kim is currently a Research Scientist at the MIT-IBM AI Watson Lab, and will be joining MIT as an assistant professor in 2021. Yoon’s research focuses on machine learning and natural language processing. His PhD thesis is titled "Deep Latent Variable Models of Natural Language", which he completed in 2020 at Harvard University. We discuss his work on uncovering latent structure in natural language, including continuous vector representations, tree structures, and grammars. We cover learning and variational inference methods that he developed during his PhD, and he offers a look at where latent variable models will be heading in the future. Episode notes: https://cs.nyu.edu/~welleck/episode6.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[05] Julian Togelius - Computational Intelligence and Games 1:12:06
Julian Togelius is an Associate Professor at New York University, where he co-directs the NYU Game Innovation Lab. His research is at the intersection of computational intelligence and computer games. His PhD thesis is titled "Optimization, Imitation, and Innovation: Computational Intelligence and Games", which he completed in 2007. We cover his work in the thesis on AI for games and games for AI, and how it connects to his recent work on procedural content generation. Episode notes: https://cs.nyu.edu/~welleck/episode5.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at https://www.buymeacoffee.com/thesisreview…

[04] Sebastian Nowozin - Learning with Structured Data: Applications to Computer Vision 1:44:32
Sebastian Nowozin is currently a Researcher at Microsoft Research Cambridge. His research focuses on probabilistic deep learning, consequences of model misspecification, understanding agent complexity in order to improve learning efficiency, and designing models for reasoning and planning. His PhD thesis is titled "Learning with Structured Data: Applications to Computer Vision", which he completed in 2009. We discuss the work in his thesis on structured inputs and structured outputs, which involves beautiful ideas from polyhedral combinatorics and optimization. We talk about his recent work on Bayesian deep learning and the connections it has to ideas that he explored during his PhD. Episode notes: https://cs.nyu.edu/~welleck/episode4.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html…

[03] Sebastian Ruder - Neural Transfer Learning for Natural Language Processing 1:24:32
Sebastian Ruder is currently a Research Scientist at DeepMind. His research focuses on transfer learning for natural language processing, and making machine learning and NLP more accessible. His PhD thesis is titled "Neural Transfer Learning for Natural Language Processing", which he completed in 2019. We cover transfer learning from philosophical and technical perspectives, and talk about its societal implications, focusing on his work on sequential transfer learning and cross-lingual learning. Episode notes: https://cs.nyu.edu/~welleck/episode3.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html…

[02] Colin Raffel - Learning-Based Methods for Comparing Sequences 1:15:54
Colin Raffel is currently a Senior Research Scientist at Google Brain, and soon to be an assistant professor at the University of North Carolina. His recent work focuses on transfer learning and learning from limited labels. His thesis is titled "Learning-Based Methods for Comparing Sequences, with Applications to Audio-to-MIDI Alignment and Matching", which we discuss along with the connections to his later work, and plans for the future. Episode notes: https://cs.nyu.edu/~welleck/episode2.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html…

[01] Gus Xia - Expressive Collaborative Music Performance via Machine Learning
Gus Xia is an assistant professor at New York University Shanghai. His research explores machine learning for music, with a goal of building intelligent systems that understand and extend musical creativity and expression. His PhD thesis is titled Expressive Collaborative Music Performance via Machine Learning, which we discuss in depth along with his ongoing research at the NYU Shanghai Music X Lab. - Gus Xia's homepage: https://www.cs.cmu.edu/~gxia/ - Thesis: http://reports-archive.adm.cs.cmu.edu/anon/ml2016/CMU-ML-16-103.pdf Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html…
[00] The Thesis Review Podcast - Introduction by Sean Welleck