"Introduction to abstract entropy" by Alex Altair
https://www.lesswrong.com/posts/REA49tL5jsh69X3aM/introduction-to-abstract-entropy#fnrefpi8b39u5hd7
This post, and much of the following sequence, was greatly aided by feedback from the following people (among others): Lawrence Chan, Joanna Morningstar, John Wentworth, Samira Nedungadi, Aysja Johnson, Cody Wild, Jeremy Gillen, Ryan Kidd, Justis Mills and Jonathan Mustin. Illustrations by Anne Ore.
Introduction & motivation
In the course of researching optimization, I decided that I had to really understand what entropy is.[1] But there are a lot of other reasons why the concept is worth studying:
- Information theory:
- Entropy tells you about the amount of information in something (see the short sketch after this list).
- It tells us how to design optimal communication protocols.
- It helps us understand strategies for (and limits on) file compression.
- Statistical mechanics:
- Entropy tells us how macroscopic physical systems act in practice.
- It gives us the heat equation.
- We can use it to improve engine efficiency.
- It tells us how hot things glow, which led to the discovery of quantum mechanics.
- Epistemics (an important application to me and many others on LessWrong):
- The concept of entropy yields the maximum entropy principle, which is extremely helpful for doing general Bayesian reasoning.
- Entropy tells us how "unlikely" something is and how much we would have to fight against nature to get that outcome (i.e. optimize).
- It can be used to explain the arrow of time.
- It is relevant to the fate of the universe.
- And it's also a fun puzzle to figure out!
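To make the information-theory bullets concrete, here is a minimal sketch of Shannon entropy, the quantity the episode builds up to via binary labels and yes/no questions. This is my illustration, not code from the post; the function name `shannon_entropy` is mine, and it assumes the standard definition of entropy in bits.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution, given as a list of probabilities."""
    # H = -sum_i p_i * log2(p_i); terms with p_i = 0 contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per flip; a heavily biased coin carries much less,
# which is what makes a file of mostly-identical symbols compressible.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.99, 0.01]))  # ~0.081
# 8 equally likely outcomes take log2(8) = 3 bits (three yes/no questions) to pin down.
print(shannon_entropy([1/8] * 8))     # 3.0
```

The last example previews the episode's "binary string labels" and "yes/no questions" chapters: distinguishing among N equally likely states costs log2(N) bits.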
I didn't intend to write a post about entropy when I started trying to understand it. But I found the existing resources (textbooks, Wikipedia, science explainers) so poor that it actually seems important to have a better one as a prerequisite for understanding optimization! One failure mode I was running into was that other resources tended only to be concerned about the application of the concept in their particular sub-domain. Here, I try to take on the task of synthesizing the abstract concept of entropy, to show what's so deep and fundamental about it. In future posts, I'll talk about things like:
Chapters
1. "Introduction to abstract entropy" by Alex Altair (00:00:00)
2. Introduction & motivation (00:00:09)
3. Abstract definition (00:03:38)
4. Macrostates (00:06:07)
5. Dice Diagram (00:07:27)
6. Two basic strategies for distinguishing states (00:08:07)
7. Binary string labels (00:08:52)
8. Image: Binary bar-codes (00:10:47)
9. Image: first 31 binary strings (00:13:46)
10. Equation 1 (00:21:27)
11. Equation 2 (00:21:50)
12. Equation 3 (00:22:16)
13. Equation 4 (00:22:48)
14. Yes/no questions (00:23:32)
15. Image: guess who (00:25:06)
16. How they compare (00:26:35)
17. Exactly what is a bit? (00:29:13)
18. Diagram: entropy of a system (00:32:48)
19. Probabilities over states (00:33:28)
20. Equation 5 (00:33:52)
21. Equation 6 (00:38:07)
22. Equation 7 (00:38:13)
23. Negentropy (00:39:46)
24. Equation 8 (00:40:18)
25. Equation 9 (00:43:08)
26. What's next (00:44:04)