Around the Web 18
Street Names 🛣️, Transcription Factors 🧬, and Liquid Neural Networks 🤖
[Talking🗣️points for your next outing📍]
🚶🏿‍♀️ Walking just 4,000 steps per day may be enough to help extend your life.
🧂Does a low-salt diet really improve your health?
👩‍💼 Fifteen in-demand jobs of the future.
🛣️ The most common street names and addresses in the United States.
[Long Read 📰 / Watch 🎥]
[I]: Transcription factors interact with RNA to regulate genes.
Very interesting research out of the field of molecular biology: transcription factors (TFs) also bind RNA. (We have always known that TFs mainly bind proteins and DNA.) One of the paper’s authors stated: “It’s as if, after carrying around a Swiss Army knife all your life for its blade and scissors, you suddenly realize that the odd, small piece in the back of the knife is a screwdriver.” He continued: “It’s been staring you in the face this whole time, and now that you finally see it, it becomes clear how many more uses there are for the knife than you had realized.” Here is the commentary on the research.
[II]: Liquid Neural Networks can solve AI problems from robotics to self-driving cars.
MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a new type of deep learning architecture called Liquid Neural Networks (LNNs) that offers a compact, adaptable, and efficient solution to certain AI problems. These networks are particularly effective in areas where traditional deep learning models struggle, such as robotics and self-driving cars. LNNs use a mathematical formulation that is less computationally expensive and stabilizes neurons during training, allowing them to adapt to new situations after training, a capability not found in typical neural networks. Some excerpts from the commentary (paper):
“One of the most striking features of LNNs is their compactness. For example, a classic deep neural network requires around 100,000 artificial neurons and half a million parameters to perform a task such as keeping a car in its lane. In contrast, Rus and her colleagues were able to train an LNN to accomplish the same task with just 19 neurons.”
“LNNs, on the other hand, appear to have a better grasp of causal relationships, allowing them to better generalize to unseen situations.”
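To make the idea a little more concrete, here is a minimal, illustrative sketch of the liquid time-constant dynamics that underlie LNNs, as described in the CSAIL group's published work. Everything below (the function name `ltc_step`, the weight matrix `W`, bias `A`, the `tanh` choice for the bounded nonlinearity, and the Euler integration) is my own assumption for illustration, not the authors' implementation; the key point it demonstrates is that each neuron's effective time constant varies with its input, which is where the "liquid" adaptability comes from.

```python
import numpy as np

def ltc_step(x, I, W, A, tau, dt=0.01):
    """One explicit-Euler step of liquid time-constant (LTC) dynamics.

    Sketch of the state equation dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A,
    where f is a bounded nonlinearity of the state x and input I. Because f
    depends on the input, the effective time constant (1/tau + f) changes as
    the input changes -- the neuron's dynamics are "liquid".
    """
    f = np.tanh(W @ np.concatenate([x, I]))  # bounded synaptic activation
    dxdt = -(1.0 / tau + f) * x + f * A      # input-dependent decay + drive
    return x + dt * dxdt

# Tiny demo: 3 neurons driven by a constant 2-dimensional input.
rng = np.random.default_rng(0)
n, m = 3, 2
W = rng.normal(size=(n, n + m))  # hypothetical weights, for illustration only
A = rng.normal(size=n)
x = np.zeros(n)
for _ in range(100):
    x = ltc_step(x, np.array([1.0, -0.5]), W, A, tau=1.0)
print(x.shape)  # state of the 3 neurons after 100 steps
```

Note the contrast with a standard recurrent cell, whose update rule is fixed after training: here the decay term itself is a function of the input, which is one intuition for why such networks can keep adapting to shifting conditions.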
[III]: Don’t Waste Food.
This is a beautiful, well-articulated infographic essay on food waste and its impact on the environment, by Reuters. The article emphasizes that humans waste a staggering one-third of the food they produce, which contributes to greenhouse gas emissions. The EPA's "Food Recovery Hierarchy" provides a guideline on how to manage leftover food, with solutions ranging from using produce and animals in their entirety to composting and using food waste for energy production. The piece also highlights the importance of rethinking landfill practices, as landfills contribute to emissions of methane, a potent greenhouse gas.
[IV]: What is Classical Theism?
I have probably said this before, but it’s worth saying again: Ed Feser is one of my favorite (Christian) philosophers. So naturally, I love reading his works and hearing him speak. Here is an hour-long podcast where he addresses the question: What is Classical Theism?
“What is classical theism? What makes it distinct from theism in general? How should we answer common charges that classical theism makes God cold, distant, and unloving? Is there accuracy to the charge that classical theism is based purely on Greek philosophy as opposed to Scripture? What should we make of the confusion over terminology, specifically regarding “theistic personalism”, “neo-classical theism”, “process theism”, and other labels? Dr. Edward Feser joins us to discuss these topics.”