Collected Thoughts

A cathartic place for my thoughts.

The Meditations of a Brat Summer

2025-04-23

I turned 33 years old 5 days ago, which would generally mean that I’m old, and through conventional thinking, thus largely out of touch with the world around me. As someone who constantly seeks deeper understanding, be it of the laws of nature or the mechanisms by which electricity and magnetism dance across your screen to bring you these words, I’m fascinated by the forces that drive cultural innovation, especially those that seem to resonate with people whose life experience has been significantly different from mine. Ever-shifting trends in artistic expression, writing, thought, music, and communication define a sort of dynamic topology on top of which society moves. Where you sit on that landscape, I believe, sets your initial conditions: your capacity to relate to others, each of whom occupies their own unique position on the cultural terrain. Being 33 does not mean I need to have my head in the sand.

Charli XCX is an important pop culture icon with an impressive discography that dates all the way back to 2008, when she posted her songs on Myspace. Meaning she is also old (although I am 102 days older than Charli). Despite this, her real rise to the frontal cortex of the American media consciousness happened in 2024, with the arrival of the smash-hit album “Brat.” The album is laced, front to back, with an unrelenting youthful energy that almost naturally led to widespread adoption. These songs became the basis functions of the summer season, and thus the “Brat Summer” of 2024 arose, which slowly (and with a large phase delay) made its way to me. I, being curious to a fault, needed to understand what this was about and where the allure lay. So in the fall, just as the leaves were turning red and Brat Summer was fading, I gave the album a listen.

Marcus Aurelius is widely accepted as one of the most impactful Stoic philosophers of Ancient Rome. His writings in “Meditations” (a collection of essays written throughout his life, never intended for publication) are deeply introspective, centered on human existence, and relatable. Marcus grapples with the repetitive and cyclic nature of generations of mankind, places his life in a perspective at once meaningful and insignificant among the cosmos, and seeks connection to nature and the universe around him for the time he is alive. Despite his militaristic status, the man behind the words of “Meditations” clearly longed to do good by all of humanity and to live a life of virtue and peace. Stay with me here.

As a fan of electronic music, listening to Charli XCX did not feel like a large departure from the sorts of sounds I’m accustomed to hearing. What I wasn’t anticipating, though, was the strength of Charli’s vocals and lyrics. She is front and center with juxtaposed themes of self-love, acceptance, and being okay with her messy and chaotic nature, all steeped in deep vulnerability. On one end, Charli knows she’s hot and puts that energy forward so unapologetically that it somehow captures you and brings you along in a way that you can’t help but also feel hot. The opening 4 bars of “Von dutch” say it all:

It's okay to just admit that you're jealous of me
Yeah, I heard you talk about me, that's the word on the street
You're obsessin', just confess it, put your hands up
It's obvious, I'm your number one

When I hear this, I can’t help but feel empowered and positive. Critically, “Brat” repeatedly reflects on personal imperfections, uncertainty, and a sense of general sloppiness in a way that can’t help but feel relatable. “Girl, so confusing” is a blindingly clear self-report on the challenges of navigating life, love, and calling into question your own nature. I can’t help but see many parallels to “Meditations” through Charli’s work.

At some point I will write more about this, but until then, perhaps reflect a bit.

Complex from Simple

Sometimes simple things show complex behavior when you expect them not to - I find that really interesting.

Sure, it’s easy to think that chaotic systems with many degrees of freedom would yield complex behavior, and in most cases that’s true. However, the same can be true of incredibly simple models, in ways which feel unintuitive. In a single, one-line equation, not only is chaos found, but also order. The variables seem to ebb and flow between complete randomness and absolute stability in a way that has structure and some kind of logic. How can this be?

Bifurcations originating from simple bounded population growth.

Logistic map

Interesting behavior can come from uninteresting equations. The logistic map is shown above, and it can be applied to any number of situations, though many people first meet it as a model of bounded population growth. Think of a population of animals that varies over time, expressed as a fraction from 0 to 1, where 1 represents the maximum number of animals the environment can support. The equation says that the growth rate (r) relates the population fraction next year (xn+1) to the population fraction this year (xn). In other words, if our animals are really good at making babies, and the environment can support it, the population will grow until it reaches a steady state where births and deaths are balanced. This is all well and good, but the question becomes: how does the animals’ ability to mate (i.e. the growth rate) determine the equilibrium population fraction in the end? Of course, if the rate is less than 1, the population has more deaths than births and dies out within a few generations, meaning the stable population fraction is always zero. Above 1, the population grows because the births outweigh the deaths, just as we suspect. However, when the growth rate exceeds 3, something wild happens. The stable population spontaneously splits into an oscillation between two states, one increasing with growth rate and one decreasing. This continues until roughly 3.45, at which point the oscillation splits again into four states, then eight, and so on, until chaos arrives: behavior with no apparent pattern or predictability, which then unintuitively and fleetingly descends into order once again. These cycles persist, on and on, between order and disorder, seemingly defying entropy, until the growth rate reaches 4.
Amazingly, at a growth rate of 3.9998 the population converges toward a stable fraction of 0.17, whereas at 3.9999 the population reaches close to 1: incredible instability and sensitivity. But a bit before that, near growth rates of 3.83, the population falls into a completely predictable three-state oscillation, with stable points near 0.95, 0.5, and 0.15. Somehow, at this point, the population can only occupy one of those three states, whereas slightly further along the randomness reemerges. I’m not a mathematician, but this is amazing to me and I love thinking about it.
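The behavior described above is easy to reproduce yourself. Here is a minimal sketch in Python (the function name, starting point, and iteration counts are my own illustrative choices): at r = 3.2 the trajectory settles onto a two-state oscillation, while at r = 3.83, inside the periodic window, it settles onto the three-state cycle.

```python
# Minimal sketch of the logistic map, x_{n+1} = r * x_n * (1 - x_n).
# Iterate long enough to discard transients, then report where the
# trajectory settles. Function name and parameters are illustrative.

def logistic_trajectory(r, x0=0.5, burn_in=2000, keep=9):
    """Return the last `keep` iterates of the map, rounded to 3 decimals."""
    x = x0
    for _ in range(burn_in):        # let transients die out
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):           # record the settled behavior
        x = r * x * (1 - x)
        out.append(round(x, 3))
    return out

print(logistic_trajectory(3.2))     # alternates between two values
print(logistic_trajectory(3.83))    # cycles through three values
print(logistic_trajectory(3.9999))  # no apparent repetition: chaos
```

Sweeping r from 2.5 to 4 with this function and plotting the settled values against r reproduces the bifurcation diagram shown above.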

wtf this is awesome

Simulation of Neurophysiology

Hi there,

I simulated many properties of neurons, including their resting membrane potential, action potential kinetics in response to injected current, as well as various other cellular properties. I was inspired to work on this code by my graduate neuroscience course on neurophysiology; really, I did all this because it’s a mechanism to help me study and learn the intuition behind cellular processes. Sometimes a background in engineering can be helpful in understanding the complex aspects of biology. This code calculates ionic equilibrium potentials with the Nernst equation, examines membrane responses under varying capacitances and resistances, and models the passive attenuation of signals along neuronal processes. It also includes original data from Hodgkin & Huxley’s 1952 paper on action potentials to simulate them in neurons under current clamp (some of that code came from the University of Michigan). Currents and potentials of individual ions are simulated and can be modulated in the program to investigate action potentials in finer-grained detail.
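As a taste of what the code does, the Nernst calculation can be sketched in a few lines (Python here rather than Matlab; the ion concentrations are generic textbook values for a mammalian neuron, not necessarily the ones used in my program):

```python
import math

# Sketch of the Nernst equation, E = (RT / zF) * ln([ion]_out / [ion]_in),
# which gives the equilibrium potential for a single ion species.
# Concentrations (mM) are typical textbook values for a mammalian neuron.

R = 8.314      # gas constant, J/(mol*K)
F = 96485.0    # Faraday constant, C/mol
T = 310.0      # body temperature, K

def nernst(conc_out, conc_in, z):
    """Equilibrium potential in millivolts for an ion of valence z."""
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out / conc_in)

E_K  = nernst(5.0, 140.0, +1)    # potassium: strongly negative
E_Na = nernst(145.0, 15.0, +1)   # sodium: strongly positive
print(f"E_K  = {E_K:.1f} mV")
print(f"E_Na = {E_Na:.1f} mV")
```

The resting membrane potential sits between these two equilibrium values, weighted by each ion’s membrane permeability, which is why injecting current that opens sodium channels drives the cell toward the positive sodium potential.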

All of this is done in Matlab, and I’d be excited for you to download the code below and run the program; just don’t judge my programming abilities, or lack thereof. (:

Code is located on my GitHub, here.


Look for Answers Yourself.

Ever since I was a little boy, I spent a large portion of my youth in my parents’ garage trying to build things which helped me learn more about nature. From extracting and modifying red laser diodes scavenged from broken CD players, to winding speaker wire around bundles of carpentry nails to build crude electromagnets, for me there was always a sense of discovery and adventure that came from building things which revealed something about the world. This self-discovery process slowly built in me an understanding of many fundamental aspects of nature, not from books, but from direct observation.

When I was fourteen, I got my first job as a teacher’s assistant at a local community college in an oceanic chemistry lab. My favorite part of the job was not helping students or answering their questions, it was the secret experiments I’d conduct in the back of a storage room without the professor knowing. Mixing strong sulfuric acid with a host of different materials (wood, a piece of my cotton t-shirt, paper towels) taught me many of the chemical’s properties in a way which completely captured my imagination.

Two years later, after being absolutely captivated by the kitchen toaster, I decided to build my own wire-wound resistors. I raided my grandfather’s stash of nine-volt batteries which he kept (unsurprisingly) in the garage, and connected twenty of them together in series. With 180 volts at my disposal, I was able to test different resistor designs and observe their glowing-red-hot response to high voltage and current. On the fifth resistor I tested, one of the batteries I was using was under so much load that it actually ruptured, spraying smoke and liquid across my grandfather’s carpet. After he rushed in and rightfully scolded me, he looked at the set of batteries and their wiring before concluding: “Hmm, that should have worked!” I had never felt closer to my grandfather.

From a young age, intense curiosity always motivated me to learn more about the world. This feeling never really subsided or slowed down, it only shifted forms. Going from questions of physics, chemical reactions, and electronics, to questions of the brain and how it processes information, my adult endeavors clearly mirror those of my childhood. Being inquisitive is central to my life and self-identity; I imagine I’d be horribly depressed if I were any other way.

Going Anywhere is Difficult

Our brains constantly compute very complicated things in unintuitive ways to accomplish exceedingly normal and boring tasks. For example, in order to get to English class, I had to navigate here from Boelter Hall one way or another. For me to do that without a GPS device is (hopefully) trivial at a high level, but building and programming a device capable of doing the same, relying only on deep learning and image data, would be very challenging. People navigate by using incoming visual information to recognize patterns and landmarks in the environment, which dictate our actions and build an internal path. But how can we do such a complex task so naturally? Researchers have found specific, individual cells in a region of the brain (the hippocampus) that have evolved to very quickly learn to activate only when you are in a single location. These cells integrate visual and other incoming information, which cumulatively results in selective activation at only one place. Over time and through learning, you build a cognitive map, cell by cell, which continuously becomes more or less precise based on your experiences. Although the full story is more complex, this one small aspect of spatial memory starts to describe just how difficult it is to find your way to where you need to be.


Watching our Brain Think

The human brain is considered by many the most complex object in nature. Inside each of us is an organ chiseled from hundreds of millions of years of evolution, composed of billions of microscopic computational cells (neurons) that form dynamic, functional networks with one another to somehow give us our thoughts and senses. These ever-changing cognitive representations of sensory stimuli give rise to much of what makes us who we are: our memory, creativity, personality, the ability to read these words, and even to interpret my words in a (hopefully) meaningful way. In so many ways, researchers still don’t understand precisely how the brain accomplishes these things, let alone how it does so while consuming only a third of the energy of a standard light bulb.

Brain cells fundamentally communicate with one another through rapid electrical signals, called action potentials. These brief blips of voltage result in the transfer of information from one cell to the next, while setting a host of complex intracellular chemical-signaling and gene-expression pathways into motion. Historically, scientists studying these events could only do so by implanting sensitive microelectrodes into the brain, each of which can only sense signals from a single cell. However, recent advances in genetics and protein engineering have given rise to a new way to detect electrical activity in the brain: by seeing it.

It is only during an action potential that nearby charged ions dramatically rush into, or out of, the cell. One ion in particular, calcium, flows into the cell rapidly during an action potential, binding immediately to a calcium-sensitive intracellular protein called calmodulin. Remarkably, researchers have successfully modified the calmodulin protein, without impairing its function, to include an additional domain that fluoresces only in the presence of calcium. By leveraging this calcium-calmodulin binding interaction that takes place during action potentials, one can use a standard optical fluorescence microscope to watch brain cells communicate.

To drive these proteins into cells of interest, scientists have gone a step further, creating synthetic viruses that contain the exact genetic instruction set for making the fluorescent version of calmodulin (called GCaMP). When these viruses are injected into an organism, they deliver their genetic content into the cell, and the machinery the cell normally uses to make calmodulin produces GCaMP instead.

At this point, it is possible to watch neural circuits conduct computations in hundreds of cells as organisms learn, navigate, and recall memories. This genetic tool has been nothing short of transformative to the field of neuroscience, enabling a new discipline of in-vivo functional imaging which has already contributed key findings in our understanding of the brain.

BREAD: A Photo Essay.

Bread is really, really old.

Archaeological evidence suggests that primitive forms of bread were being made in Europe 30,000 years ago. That’s the Upper Stone Age. Before any written history, major human civilization, or technological revolution, there was bread, and people ate it.


Why bread?

It’s $3.99 from any market …

Bread is something which has sustained humankind for thousands of years. It has been called “the staff of life” and has deep cultural and religious significance. In early 20th century Russia, the Bolsheviks led the October Uprising on a platform promising the people “Peace, Land, and Bread.” In India, life’s basic necessities are "roti, kapra aur makan," or bread, cloth, and house. There are ancient hieroglyphs in Egypt showing depictions of bread and people eating it. A “breadwinner” in western societies is the member of a family who earns the vast majority of the income, presumably to buy bread. In my favorite movie of all time, “There Will Be Blood,” the main character successfully convinces a community to allow him to drill for oil by promising them bread.

Slab stele from the tomb of Itjer at Giza, 4th Dynasty, c. 2500 BC.

Ancient hieroglyphs contain bread.

Vertical slices are shown at the table for eating.


From the film:

“Please don’t be insulted if I speak about this [lack of] bread. To my mind, it’s an abomination to consider that any man, woman or child in this magnificent country of ours should have to look upon a loaf of bread as a luxury.”



Historically, bread has been a necessity for life, and because of that it interested me. Besides its cultural and historical significance, though, I’m also interested in bread from a nutritional perspective. One well-known food scientist at UC Davis once said:

If I gave you a bag of flour and water, you could live on it for a while, but eventually you would die. If you take that same bag of flour and water, and bake it into bread, you could live indefinitely.
— Professor Bruce German

So then, there’s something fundamental to the process of baking, through the inclusion of yeast and heat and so forth, which transforms the raw materials into a product that can support life. That simple observation drove me to start making my own bread.

To make bread, not a lot is needed. Fundamentally you only need three things.

  1. Flour

  2. Water

  3. Salt

Not included in this list is the critical ingredient that makes bread light, airy, and flavorful: the yeast. Historically, bread was created from natural, wild strains of yeast present everywhere in the environment, and I aimed to do the same. Combine equal parts flour and water by weight, and a colony of natural yeast will form over multiple weeks. It’s critical to continually feed your colony, or “starter,” each day with flour and water, to maintain a healthy and active population of yeast.


Biochemistry

A host of chemical reactions take place each time you feed your starter. The yeast actively break down the complex carbohydrates in the flour into simpler forms, using the energy gained from the process to replicate and grow. This also produces compounds that give sourdough bread its distinct and pleasant flavor.

Do this long enough, and you’ll create a starter. Mine took three weeks. Several hours after feeding, the starter will become active and many small bubbles will form as the yeast metabolizes the flour and converts it to carbon dioxide.

Once you have your starter, making the bread dough is very simple. But be aware, it takes a really long time and I recommend starting around 5:00PM, the day before. (I know, I know)


The protocol.

425 grams water

120 grams sourdough starter

220 grams whole wheat flour

330 grams bread flour

12 grams salt


Yeah, just mix it.

Leave out the salt for now. Keep going until your arm is numb, then keep going 5 minutes more.

Once done, take a break.

Now it’s critical you let the dough sit for one hour to relax. The proper term for this process is “autolyse,” and it is really important in the formation of key molecules that contribute to the bread’s final properties. Over this hour, the water will fully hydrate the flour, kicking off the enzymatic breakdown of starch and allowing the flour’s proteins (glutenin and gliadin) to begin linking into gluten. Because gluten is highly elastic, it will trap a great deal of the carbon dioxide produced during the rest of the process, making your bread light and actually edible. After the hour, add your salt and start the process of gluten organization.


Stretch and Fold

Every 30 minutes over the next 90, stretch and fold the dough. This repetitive process starts aligning the gluten molecules within the dough, making it more elastic and better able to trap the gasses produced.

*not my photo

Once your gluten is arranged and organized, let the dough rest in a proofing basket. Liberally coat the inside of your basket with flour to prevent sticking. Gather up the dough using small stretch and fold motions before placing it seam side up in your basket to ferment and proof. Add a generous amount of flour over top and along the sides of the dough to limit sticking as much as possible. Place in the refrigerator covered in plastic wrap.

Now wait for 12-16 hours.

Despite the wait being painful, it’s critical to let the dough slowly ferment in the refrigerator overnight. This gives the yeast in the dough time to slowly break down many of the complex molecules in the flour without the dough overproofing and rupturing. Throughout this fermentation the bread becomes increasingly flavorful and airy; skipping it will leave you with a sad loaf. The bread you make the next morning will be worth it.

The next step is to preheat your oven to 550 degrees F. You’ll be cooking the bread in a cast-iron Dutch oven, so place that in the oven as well to get ripping hot. One key to a nicely light bread is a process called “oven spring,” where the cold dough puffs up dramatically in response to the thermal shock of the ultra-hot oven. This necessitates that the oven be really, really hot. After 40 minutes of preheating, place your dough into the Dutch oven and score it using a razor blade or a really sharp knife. This will allow your bread to rise much more easily.


Score confidently

10 millimeter depth is ideal

Once in the oven, reduce the heat to 475 degrees F and bake for 30 minutes with the lid on, then 15 additional minutes with the lid off to develop color and a nice crust.


Patience is a virtue.

Once deeply brown, remove from the oven and let cool on a wire rack for at least 2 hours. As if you hadn’t spent long enough already. Once finished, admire your work.

Bread is a staple. It’s something people eat every day and don’t really think about. But bread, I would argue, is also a way to participate in something our ancestors did tens of thousands of years ago. It’s a way to chemically convert raw materials with the help of microorganisms into something that can sustain and nourish our bodies. Lastly, making bread is a great way to escape the stresses of grad school, if only for 18-22 hours.

Modulation Instability and a solution to the Nonlinear Schrödinger Equation

Optical waves exhibit a lot of interesting properties at high intensities. Namely, material properties thought to be constant, such as the refractive index, become intensity dependent. This implies that at high light intensities, glass acts more like quartz refractively, owing to the atomic structure and lattice composition of the material.

In one of my classes, I modeled the evolution of a Gaussian pulse through a highly nonlinear optical fiber. Under specific conditions, modulation instability can take place: a complex interplay between four-wave mixing and self-phase modulation. This results in a temporal sharpening of the pulse, something that occurs very infrequently in fiber optics. These effects have been observed in optical rogue waves, and in many areas of nature.

The split-step Fourier method was employed in Matlab to generate the pulse profile by separately calculating the dispersive and nonlinear effects. The initial pulse begins at distance = 0 (distance through the optical fiber) and is compressed before demodulating into a quasi-CW pattern after some time. The scientific validity of the simulation is questionable, but the learning process associated with it is not.
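The core of the method can be sketched briefly (Python here rather than Matlab; the dispersion and nonlinearity coefficients, grid, and step counts are arbitrary illustrative values, not those from the class project). Each step applies dispersion in the frequency domain and the Kerr nonlinearity in the time domain:

```python
import numpy as np

# Minimal split-step Fourier sketch for the nonlinear Schrodinger equation
#   dA/dz = -i*(beta2/2) * d^2A/dt^2 + i*gamma * |A|^2 * A
# Dispersion is handled in the frequency domain, nonlinearity in the time
# domain. All parameter values below are illustrative, not from the project.

N = 1024
t = np.linspace(-20.0, 20.0, N, endpoint=False)   # time window
dt = t[1] - t[0]
omega = 2 * np.pi * np.fft.fftfreq(N, d=dt)       # angular frequencies

beta2 = -1.0      # anomalous dispersion (needed for modulation instability)
gamma = 1.0       # Kerr nonlinearity coefficient
dz = 0.01         # step size along the fiber
steps = 400

A = np.exp(-t**2 / 2).astype(complex)             # initial Gaussian pulse

for _ in range(steps):
    # half step of dispersion in the frequency domain
    A = np.fft.ifft(np.fft.fft(A) * np.exp(1j * beta2 / 2 * omega**2 * dz / 2))
    # full step of nonlinearity in the time domain
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)
    # second half step of dispersion
    A = np.fft.ifft(np.fft.fft(A) * np.exp(1j * beta2 / 2 * omega**2 * dz / 2))

print("peak intensity after propagation:", np.max(np.abs(A)**2))
```

Because each sub-step is a pure phase multiplication, the scheme conserves the total pulse energy exactly, which makes a handy sanity check on any implementation.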