The Lamb in Winter

Snow Drops and Honey Bee

January is leaving like a lamb, but then it really hasn’t been that rough a winter. We went cycling yesterday and I definitely overdressed for the day, if not the season. Pictured are snowdrops, found in Tower Grove Park, and in the upper left-hand corner is a honey bee, which probably flew over from the neighboring botanical gardens. We heard on the street that the magnolias are blooming too. On Friday, we had Rep tickets for “The Lion in Winter,” that tale of a royally dysfunctional family: Henry and Eleanor, Richard and John and that other son.

Seeing the Light

Muybridge Race Horse Animation

Last night, the WashU Science-on-Tap lecture series restarted for 2016. The speaker was Dr. Lihong Wang and his talk was titled “Compressed Ultrafast Photography”. He is the creator of the world’s fastest 2D receive-only camera, one that can capture light propagation at light speed, but he started his talk off much more slowly, with Eadweard Muybridge’s galloping horse movie. At the Palo Alto racetrack in 1878, Muybridge set up eleven cameras, each with a string attached to its shutter and strung taut across the track. A galloping horse came charging by, tripping each camera’s shutter in sequence. I’ve downloaded the resulting animated GIF from Wiki. Interestingly, the Wachowski brothers would revisit this photographic technique in their Matrix movies with their so-called “bullet time” effect. Wiki describes this visual effect as detaching the time and space of the viewer from that of the subject. In many ways, Dr. Wang ends up doing something similar.

Bullet through Apple, Harold Edgerton, 1964, Smithsonian

What exactly do Wang and his team do? They make movies, like Muybridge and the Wachowskis. Conventional movies have frame rates of 30-60 frames per second; Wang’s movies refresh between a billion and a trillion times per second. This is fast enough to watch light move across the screen. The speed of light is about a foot per nanosecond. The light source for his movies comes from a laser that is fired for just one trillionth of a second, a picosecond. It looks like a little red blob that in different movies is either reflected or refracted, and in a third, two blobs race each other, the faster one in air and the slower one in plastic, illustrating the different speeds of light in different media. In his newest and best movie, he has sped up his game by a thousand times. Here his pulse streaks across the screen and strikes a phosphorescent screen that then begins to glow. This movie got him the cover article of Nature.
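A quick back-of-the-envelope check of those numbers, as a sketch (the frame rates are chosen for illustration, not taken from the talk):

```python
# How far does light travel between frames at various frame rates?
# Speed of light in vacuum: ~2.998e8 m/s, i.e. roughly a foot per nanosecond.
C = 2.998e8  # meters per second

def distance_per_frame_mm(frames_per_second):
    """Distance light covers during one frame interval, in millimeters."""
    return C / frames_per_second * 1000.0

for fps, label in [(60, "conventional movie"),
                   (1e9, "a billion fps"),
                   (1e12, "a trillion fps")]:
    print(f"{label:>20}: {distance_per_frame_mm(fps):.6g} mm per frame")
```

At 60 frames per second, light crosses about 5,000 kilometers between frames; at a trillion, only about a third of a millimeter, which is why only a camera that fast can resolve a pulse crossing a tabletop scene.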

What I find most fascinating about Wang’s research is that he makes these movies with off-the-shelf equipment. His digital camera is similar to the one in your phone. The events that he is photographing occur so quickly that there is no time to get any information off the CCD chip that is your digital camera’s eye. So, the chip’s onboard memory size becomes the major limiting factor in what he can do. He invokes some rather fancy math to help alleviate this limitation. A CCD chip is a 2D array that is designed to record the x and y dimensions of an image. What Wang does is have his CCD record a single x-line of pixels from the image and then use the y-dimension to record the time history of that single x-line. What he gets is a 1D image and its time history. Then, in true Muybridge and Wachowski fashion, he gangs together a bunch of these CCDs, each one oriented to capture the x-line at a different y-position of the original image, and the result is a movie that is traveling at light speed and a career traveling faster.

Rising from the Primordial C

Broken Computer

“Computers are like Old Testament gods; lots of rules and no mercy,” once wrote Joseph Campbell. I was reminded of this saying earlier this week at work. I’d been struggling with one of our homebrewed codes. Part of my problem was that I was trying to use the code for something that it had not been designed for, but through persistence and trickery I was able to bend it to my will, almost. On the final step, when I asked it to output my files, it would invariably and immediately crash. I struggled with this for the better part of two days. Finally, I called in the cavalry and within a few minutes the problem was fixed. Allow me to geek out here for a moment. This software had originally been developed to run on UNIX, where file names are case-sensitive. It had been ported to Windows, where file names are not case-sensitive, and that is where I was running it. My file name was lower case, except for the suffix, which was upper case. Making the file name all lower case fixed the problem. Some vestigial scrap of code had not kept up with the changing times, gotten confused and brought the whole house down upon itself. It was all so perfectly clear in hindsight, but debugging always is.
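A minimal sketch of that class of bug (the file name and suffix here are made up; this is not the actual code): a suffix check written for a case-sensitive file system quietly fails when the Windows port hands it an upper-case extension.

```python
def is_data_file_unix_style(name):
    # Original UNIX-era check: exact, case-sensitive match on the suffix.
    return name.endswith(".dat")

def is_data_file_portable(name):
    # Windows file names are case-insensitive, so normalize before testing.
    return name.lower().endswith(".dat")

filename = "results.DAT"  # lower-case name, upper-case suffix

print(is_data_file_unix_style(filename))  # False -- the vestigial check balks
print(is_data_file_portable(filename))    # True  -- the one-line fix
```

The one-line normalization is the programmatic equivalent of the fix I was handed: make the two sides of the comparison agree on case before they meet.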

I have been debugging software for 45 years now, or most of my life. I started in high school, studied it in college and have worked in software ever since. In high school we would record our programs on one-inch-wide green perforated paper tape. When not in use, we would roll up our paper tapes and store them in repurposed 35mm film canisters. The more geeky members of my cadre fashioned Batman-like utility belts, wore their software around their waists and then paraded around the school, without the benefit of any secret identity. From paper tape, I graduated to punch cards and from there to magnetic tape, CDs and now the Cloud. I used to code software all day long, but now not so much. I have learned that when you work for a company that uses software to make its products, you are more handsomely rewarded for using software to make that stuff than for writing the software in the first place. As it happens, some of the files that I was able to extract were source code that I’ll now be able to use to make more stuff.

Steve Jobs once said, “Computers are like a bicycle for the mind.” This was an apt analogy when he said it in 1990. An athlete can travel further and faster on a bike than on foot, just as a person with a computer can multiply their productivity. Still, this is all hard work, requires skill and is not for everyone. In 1990, the Internet was little more than a gleam in Al Gore’s eye. Home computers were still rare and expensive. I didn’t own one then, and in 1990 my relationship with computing hadn’t even half begun. Fast-forward to today, where everybody has a hand-brain in their pocket that is way more capable and powerful than the rooms full of mainframes that I had begun using many years ago. In my lifetime, computers have enjoyed a transformative revolution. I can only wonder where we and our creations will go next.