The summer of 2014 will be remembered by most Fairbanksans as the rainiest summer in memory, but for those living on Goldstream Creek, it marks the first time in at least 30 years that the Creek went over the banks. It started with completely saturated ground from near-continuous rain in late June and was followed by more than three inches of rain falling during a single 24-hour period on July 1st and 2nd.
By the evening of the 2nd, the Creek had risen to the banks, washed out our 18-year-old log bridge, and eventually flooded our property to a depth requiring chest waders to make it out to the back cabin. It crested in the early morning hours of July 3rd, and took the entire day to return to the confines of the Creek bed.
Here are a few videos of the flooding, including one showing our bridge washing downstream in front of the back cabin.
Since the flood receded, we’ve cleaned up the mess, repaired the foundation under the arctic entryway, raised the bridge, and lifted the back cabin higher off the ground so it is no longer in danger of being flooded in the next event.
Raising the logs back up to the banks was an exercise in carefully lifting and moving very heavy things without an overhead crane. The technique is based on a pair of 4x4s bolted together at the top to form an arch. The bipod sits on the bank and leans over the edge, with one rope winch keeping the bipod from falling into the Creek and another rope winch attached to the log through a pulley hanging from the top of the bipod. For it to work, the angle at the pulley needs to be greater than 90°, so that tightening the rope connected to the load pulls the top of the bipod down toward the bank and lifts the end of the log, rather than tipping the bipod into the Creek. Here’s a photo taken during the first lift.
The cabin was lifted using a pair of “railroad jacks” on the ends of the bottom side logs. Amazingly, the entire cabin could be picked up from the very ends of the logs, and in two days we had the cabin up ten or eleven inches sitting on large pressure-treated pads.
Earlier this week we had another unprecedented rainstorm that dumped more than two inches of rain in less than 24 hours, breaking the daily record, and giving Fairbanks more than our average monthly rainfall for September in a single day. The following graph (PDF version) shows the water level in the Creek following the storm.
It took just over 24 hours for the flood to crest, and then another three days to come back down to the level it was at before the storm. We observed a similar pattern in the July flooding event, but I can’t seem to find the notebook where I recorded the heights (and then depths once it was over the banks). It is quite remarkable how quickly the water rises once it begins; when we got home from work on July 2nd the water was only a third of the way across the dog yard, and I still thought it would be contained. But it continued rising and rising. It will be interesting to compare the July and September patterns if I can find those numbers. Having a sense of what we can expect from the Creek when we get a big rainstorm is very valuable information.
It’s the beginning of a new year and time for me to look back at what I learned last year. Rather than a long narrative, let’s focus on the data. The local newspaper did a “community profile” of me this year and it was focused on my curiosity about the world around us and how we can measure and analyze it to better understand our lives. This post is a brief summary of that sort of analysis for my small corner of the world in the year that was 2013.
2013 was the year I decided to, and did, run the Equinox Marathon, so I spent a lot of time running this year and a lot less time bicycling. Since the race, I’ve been having hip problems that have kept me from skiing or running much at all. The roads aren’t cleared well enough to bicycle on them in the winter so I got a fat bike to commute on the trails I’d normally ski.
Here are my totals in tabular form:
I spent just about the same amount of time running, bicycling and skiing this year, and much less time hiking around on the trails than in the past. Because of all the running, and my hip injury, I didn’t manage to commute to work with non-motorized transport quite as much this year (55% of work days instead of 63% in 2012), but the exercise totals are all higher.
One new addition this year is a heart rate monitor, which allows me to much more accurately estimate energy consumption than formulas based on the type of activity, speed, and time. Riding my fat bike, it’s pretty clear that this form of travel is so much less efficient than a road bike with smooth tires that it can barely be called “bicycling,” at least in terms of how much energy it takes to travel over a certain distance.
Here’s the equation for men from Keytel LR, Goedecke JH, Noakes TD, Hiiloskorpi H, Laukkanen R, van der Merwe L, Lambert EV. 2005. Prediction of energy expenditure from heart rate monitoring during submaximal exercise. J Sports Sci. 23(3):289-97:

kcal = ((−55.0969 + 0.6309 × hr + 0.0901 × w + 0.2017 × a) / 4.184) × 60 × t

where:

- hr = Heart rate (in beats/minute)
- w = Weight (in pounds)
- a = Age (in years)
- t = Exercise duration time (in hours)
And here’s a SQL function that implements the version for men (to use it, you’d replace nnn, nn, and yyyy-mm-dd with the appropriate values for you):
```sql
-- Kcalories burned based on average heart rate and number
-- of hours at that rate.
CREATE OR REPLACE FUNCTION kcal_from_hr(hr numeric, hours numeric)
RETURNS numeric
LANGUAGE plpgsql
AS $$
DECLARE
    weight_lb numeric := nnn;
    resting_hr numeric := nn;
    birthday date := 'yyyy-mm-dd';
    resting_kcal numeric;
    exercise_kcal numeric;
BEGIN
    resting_kcal := ((-55.0969 + (0.6309 * resting_hr) +
                      (0.0901 * weight_lb) +
                      (0.2017 * (extract(epoch from now() - birthday) /
                                 (365.242 * 24 * 60 * 60)))) / 4.184) * 60 * hours;
    exercise_kcal := ((-55.0969 + (0.6309 * hr) +
                      (0.0901 * weight_lb) +
                      (0.2017 * (extract(epoch from now() - birthday) /
                                 (365.242 * 24 * 60 * 60)))) / 4.184) * 60 * hours;
    RETURN exercise_kcal - resting_kcal;
END;
$$;
```
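The same calculation can be sketched in Python for quick checks outside the database. The weight, age, and resting heart rate defaults below are placeholders, not my actual values:

```python
def kcal_from_hr(hr, hours, weight_lb=160.0, age_years=40.0, resting_hr=60.0):
    """Net kcal burned above resting, following the men's equation from
    Keytel et al. (2005). The default weight, age, and resting heart
    rate are placeholders -- substitute your own values."""
    def kcal(rate):
        # kJ/min from the regression, converted to kcal (/ 4.184),
        # times 60 minutes/hour, times the duration in hours.
        return ((-55.0969 + 0.6309 * rate + 0.0901 * weight_lb
                 + 0.2017 * age_years) / 4.184) * 60 * hours
    return kcal(hr) - kcal(resting_hr)
```

Note that because the weight and age terms appear in both the exercise and resting calculations, they cancel in the difference: what’s left is driven entirely by how far the heart rate is above resting.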
Here’s a graphical comparison of my exercise data over the past four years:
It was a pretty remarkable year, although the drop in exercise this fall is disappointing.
Another way to visualize the 2013 data is in the form of a heatmap, where each block represents a day on the calendar, and the color is how many calories I burned on that day. During the summer you can see my long runs on the weekends showing up in red. Equinox was on September 21st, the last deep red day of the year.
2013 was quite remarkable for the number of days where the daily temperature was dramatically different from the 30-year average. The heatmap below shows each day in 2013, and the color indicates how many standard deviations that day’s temperature was from the 30-year average. To put the numbers in perspective, approximately 95.5% of all observations will fall within two standard deviations from the mean, and 99.7% will be within three standard deviations. So the very dark red or dark blue squares on the plot below indicate temperature anomalies that happen less than 1% of the time. Of course, in a full year, you’d expect to see a few of these remarkable differences, but 2013 had a lot of remarkable differences.
2013 saw 45 days where the temperature was more than 2 standard deviations from the mean (19 that were colder than normal and 26 that were warmer), something that should only happen about 16 days out of a normal year [ 365.25(1 − 0.9545) ]. There were four days outside of 3 standard deviations from the mean. Normally there’d only be a single day [ 365.25(1 − 0.9973) ] with such a remarkably cold or warm temperature.
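The bracketed arithmetic can be checked directly, assuming daily temperature anomalies are roughly normally distributed:

```python
from statistics import NormalDist

def expected_days(k, days_per_year=365.25):
    """Expected number of days per year falling more than k standard
    deviations from the mean, two-tailed, under a normal distribution."""
    p_outside = 2 * (1 - NormalDist().cdf(k))
    return days_per_year * p_outside

expected_days(2)  # about 16.6 days
expected_days(3)  # about 1 day
```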
April and most of May were remarkably cold, resulting in many people skiing long past what is normal in Fairbanks. On May first, Fairbanks still had 17 inches of snow on the ground. Late May, almost all of June and the month of October were abnormally warm, including what may be the warmest week on record in Alaska from June 23rd to the 29th. Although it wasn’t exceptional, you can see the brief cold snap preceding and including the Equinox Marathon on September 21st this year. The result was bitter cold temperatures on race day (my hands and feet didn’t get warm until I was climbing Ester Dome Road an hour into the race), as well as an inch or two of snow on most of the trail sections of the course above 1,000 feet.
Most memorable was the ice and wind storm on November 13th and 14th that dumped several inches of snow and rain that froze on contact, followed by record high winds that knocked out power for 14,000 residents of the area, and then a drop in temperatures to colder than ‒20°F. My office didn’t get power restored for four days.
I’m moving more and more of my work into git, which is a distributed revision control system (or put another way, it’s a system that stores stuff and keeps track of all the changes). Because it’s distributed, anything I have on my computer at home can be easily replicated to my computer at work or anywhere else, and any changes that I make to these files on any system are easy to recover anywhere else. And it’s all backed up on the master repository, and all changes are recorded. If I decide I’ve made a mistake, it’s easy to go back to an earlier version.
Using this sort of system for software code is pretty common, but I’m also using it for normal text files (the docs repository below), and have started moving other things into git, such as all my eBooks.
The following figure shows the number of file changes made in three of my repositories over the course of the year. I don’t know why April was such an active month for Python, but I clearly did a lot of programming that month. The large number of file changes during the summer in the docs repository is because I was keeping my running (and physical therapy) logs in that repository.
The dog barn was the big summer project. It’s a seven by eleven foot building with large dog boxes inside that we keep warm. When the temperatures are too cold for the dogs to stay outside, we put them into their boxes in the dog barn and turn the heat up to 40°F. I have a real-time visualization of the conditions inside and outside the barn, and because the whole thing is run with a small Linux computer and Arduino board, I’m able to collect a lot of data about how the barn is performing.
One such analysis will be to see how much heat the dogs produce when they are in the barn. To estimate that, we need a baseline of how much heat we’re adding at various temperatures in order to keep it at temperature. I haven’t collected enough cold temperature data to really see what the relationship looks like, but here’s the pattern so far.
The graph plots the temperature differential between the outside and inside of the barn against the percentage of time the heater is on in order to maintain that differential, for all 12-hour periods when the dogs weren’t in the barn and there are fewer than four missing observations. I’ve also run linear and quadratic regressions in order to predict how much heat will be required at various temperature differentials.
The two r² values show how much of the variation in heating is explained by the temperature differential for the linear and the quadratic regressions. I know that this isn’t a linear relationship, but the linear model still fits the data better than the quadratic model does. It may be that it’s some other form of non-linear relationship that isn’t well expressed by a second-order polynomial.
Once we can predict how much heat it should take to keep the barn warm at a particular temperature differential, we can see how much less heat we’re using when the dogs are in the barn. One complication is that the dogs produce enough moisture when they are in the barn that we need to ventilate it when they are in there. So in addition to the additive heating from the dogs themselves, there will be increased heat losses because we have to keep it better ventilated.
It’ll be an interesting data set.
Power consumption is a concern now that we’ve set up the dog barn and are keeping it heated with an electric heater. It’s an oil-filled radiator-style heater, and uses around 1,100 Watts when it’s on.
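Given that wattage, the heater’s energy use follows directly from how long it runs. A back-of-the-envelope sketch (the 25% duty cycle below is just an example, not a measurement):

```python
HEATER_WATTS = 1100  # approximate draw of the barn heater when running

def heater_kwh(hours_elapsed, duty_cycle):
    """Energy used over a period, given the fraction of that period
    (duty_cycle, between 0 and 1) the heater was actually on."""
    return HEATER_WATTS * hours_elapsed * duty_cycle / 1000

# A full day at an illustrative 25% duty cycle:
heater_kwh(24, 0.25)  # 6.6 kWh
```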
This table shows our overall usage by year for the period we have data.
| Year | Average Watts | Total kWh |
|------|---------------|-----------|
Our overall energy use continues to go down, which is a little surprising to me, actually, since we eliminated most of the devices known to use a lot of electricity (incandescent light bulbs, halogen floodlights) years ago. Despite that, and bringing the dog barn on line in late November, we used less electricity in 2013 than in the prior three years.
Here’s the pattern by month, and year.
The spike in usage in November is a bit concerning, since it’s the highest overall monthly consumption for the past four years. Hopefully this was primarily due to the heavy use of the heater during the final phases of the dog barn construction. December wasn’t a particularly cold month relative to years past, but it’s good to see that our consumption was actually quite low even with the barn heater being on the entire month.
That wraps it up. Have a happy and productive 2014!
The list of books for the 2014 Tournament of Books has been released. Once again, I plan to keep the list up to date with what I’ve read and whether I thought each book is good enough to win. One star (☆) means I didn’t like it but managed to finish it, two stars (☆☆) means I liked it but I didn’t think it should win, and three stars (★★★) means it was one of the better books I read this (or last) year and I’d be happy if it won the Tournament. The last several years my personal favorites going into the contest have been eliminated, but thus far I haven’t been disappointed with the eventual winner.
- At Night We Walk in Circles by Daniel Alarcón
- The Luminaries by Eleanor Catton
- The Tuner of Silences by Mia Couto
- The Signature of All Things by Elizabeth Gilbert
- How to Get Filthy Rich in Rising Asia by Mohsin Hamid
- The Dinner by Herman Koch
- The Lowland by Jhumpa Lahiri
- Long Division by Kiese Laymon
- The Good Lord Bird by James McBride
- Hill William by Scott McClanahan ☆☆
- The Son by Philipp Meyer
- A Tale for the Time Being by Ruth Ozeki ☆☆
- Eleanor & Park by Rainbow Rowell
- The Goldfinch by Donna Tartt
- The People in the Trees by Hanya Yanagihara ☆☆
- Pre-Tournament Playoff winner
Pre-Tournament Playoff Round
I’ve got a lot of reading to do between now and March, since I’ve only read two of the seventeen books chosen. Some seem like pretty obvious choices, but at least half of them are unfamiliar to me. And I just started reading The Flamethrowers, so I can’t even start on these until I’m done with that book. The good news is that all of them are available as eBooks from my local bookseller (Gulliver’s Books). That probably means they are in Amazon’s Kindle library as well.
I spent most of October and November building a dog barn for the dogs. Our two newest dogs (Lennier and Monte) don’t have sufficient winter coats to be outside when it’s colder than ‒15°F. A dog barn is a heated space with large, comfortable, locking dog boxes inside. The dogs sleep inside at night and are pretty much in the house with us when we’re home, but when we’re at work or out in town, the dogs can go into the barn to stay warm on cold days.
You can view the photos of the construction on my photolog.
Along with the dog boxes we’ve got a monitoring and control system in the barn:
- An Arduino board that monitors the temperature (DS18B20 sensor) and humidity (SHT15) in the barn and controls an electric heater through a Power Tail II.
- A BeagleBone Black board running Linux which reads the data from the Arduino board and inserts it into a database, and can change the set temperature that the Arduino uses to turn the heater on and off (typically we leave this set at 30°F, which means the heater comes on at 28 and goes off at 32°F).
- An old Linksys WRT-54G router (running DD-WRT) that connects to the wireless network in the house and to the BeagleBone via Ethernet.
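The on/off behavior described above (set point 30°F, heater on at 28, off at 32) is a simple hysteresis loop. A sketch of the logic, with illustrative names rather than the actual Arduino code:

```python
SWING = 2.0  # °F above/below the set point before switching

def update_heater(temp_f, set_point_f, heater_on):
    """Return the new heater state given the current temperature,
    set point, and current heater state."""
    if temp_f <= set_point_f - SWING:
        return True    # too cold: turn (or keep) the heater on
    if temp_f >= set_point_f + SWING:
        return False   # warm enough: turn it off
    return heater_on   # inside the dead band: keep the current state
```

The dead band is what keeps the heater from rapidly cycling on and off around the set point, which a naive single-threshold comparison would do.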
The system allows us to monitor the conditions inside the barn in real-time, and to change the temperature. It is a little less robust than the bi-metallic thermostat we were using initially, but as long as the Arduino has power, it is able to control the heat even if the BeagleBone or wireless router were to fail, and is far more accurate. It’s also a lot easier to keep track of how long the heater is on if we’re turning it on and off with our monitoring system.
Thursday we got an opportunity to see what happens when all the dogs are in there at ‒15°F. They were put into their boxes around 10 AM, and went outside at 3:30 PM. The windows were closed.
Here’s a series of plots showing what happened (PDF version):
The top plot shows the temperature in the barn. As expected, the temperature varies from 28°F, when the heater comes on, to a bit above 32°F when the heater goes off. There are obvious spikes in the plot when the heater comes on and rapidly warms the building. Interestingly, once the dogs were settled into the barn, the heater didn’t come on because the dogs were keeping the barn warm themselves. The temperature gradually rose while they were in there.
The next plot is the relative humidity. In addition to heating the barn, the dogs were filling it with moisture. It’s clear that we will need to deal with all that moisture in the future. We plan on experimenting with a home-built heat recovery ventilator (HRV) based on alternating sheets of Coroplast plastic. The idea is that warm air from inside travels through one set of layers to the outside, while cold air from outside passes through the other set of layers and is heated on its way in by the exiting warm air. Until that’s done, our options are to leave the two windows cracked to allow the moisture to escape (with some of the warm air, of course) or to use a dehumidifier.
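The idea behind the HRV can be sketched with the standard sensible-heat effectiveness relation. The 60% effectiveness below is an assumed value for illustration, not a measurement of a Coroplast core:

```python
def supply_temp(t_outside, t_inside, effectiveness):
    """Temperature of the incoming air after the HRV core pre-warms it,
    using the sensible-heat effectiveness relation. effectiveness is
    the fraction of the indoor/outdoor difference recovered (0 to 1)."""
    return t_outside + effectiveness * (t_inside - t_outside)

# -20 °F outside, 30 °F inside, with an assumed 60% effective core:
supply_temp(-20, 30, 0.6)  # incoming air arrives at 10 °F instead of -20
```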
The bar chart shows the number of minutes the power was on for the interval shown. Before the dogs went into the barn the heater was coming on for about 15 minutes, then was off for 60 minutes before coming back on again. As the temperature cools outside, the interval when the heater is off decreases. Again, this plot shows the heater stopped coming on once the dogs were in the barn.
The bottom plot is the outside temperature.
So far the barn is a great addition to the property, and the dogs really seem to like it, charging into the barn and into their boxes when it’s cold outside. I’m looking forward to experimenting with the HRV and seeing what happens under similar conditions but with the windows slightly open, or when the outside temperatures are much colder.
My last blog post compared the time for the men who ran both the 2012 Gold Discovery Run and the Equinox Marathon in order to give me an idea of what sort of Equinox finish time I can expect. Here, I’ll do the same thing for the 2012 Santa Claus Half Marathon.
Yesterday I ran the half marathon, finishing in 1:53:08, an average pace of 8:38 (8.63 minutes) per mile. I’m recovering from a mild calf strain, so I ran the race very conservatively until I felt like I could trust my legs.
I converted the SportAlaska PDF files the same way as before, and read the data in from the CSV files. Looking at the data, there are a few outliers in this comparison as well. In addition to being outside of most of the points, they are also times that aren’t close to my expected pace, so they are less relevant for predicting my own Equinox finish. Here’s the code to remove them and perform the linear regression:
```r
combined <- combined[!(combined$sc_pace > 11.0 | combined$eq_pace > 14.5),]
model <- lm(eq_pace ~ sc_pace, data=combined)
summary(model)

Call:
lm(formula = eq_pace ~ sc_pace, data = combined)

Residuals:
     Min       1Q   Median       3Q      Max
-1.08263 -0.39018  0.02476  0.30194  1.27824

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) -1.11209    0.61948  -1.795   0.0793 .
sc_pace      1.44310    0.07174  20.115   <2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.5692 on 45 degrees of freedom
Multiple R-squared: 0.8999, Adjusted R-squared: 0.8977
F-statistic: 404.6 on 1 and 45 DF,  p-value: < 2.2e-16
```
There were fewer male runners in 2012 that ran both Santa Claus and Equinox, but we get similar regression statistics. The model and coefficient are significant, and the variation in Santa Claus pace times explains just under 90% of the variation in Equinox times. That’s pretty good.
Here’s a plot of the results:
As before, the blue line shows the model relationship, and the grey area surrounding it shows the 95% confidence interval around that line, the range within which the true regression line should fall 95% of the time. The red line is the 1:1 line. As you’d expect for a race twice as long, all the Equinox pace times are significantly slower than for Santa Claus.
There were fewer similar runners in this data set:
| Runner | DOB | Santa Claus | Equinox Time | Equinox Pace |
|--------|-----|-------------|--------------|--------------|
This analysis predicts that I should be able to finish Equinox in just under five hours, which is pretty close to what I found when using Gold Discovery times in my last post. The model predicts a pace of 11:20 and an Equinox finish time of four hours and 57 minutes, and these results are within the range of the three similar runners listed above. Since I was running conservatively in the half marathon, and will probably try to do the same for Equinox, five hours seems like a good goal to shoot for.
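The prediction follows directly from the regression coefficients in the R output above:

```python
# Coefficients from the regression output above.
INTERCEPT = -1.11209
SLOPE = 1.44310
MARATHON_MILES = 26.2

sc_pace = 8.63                            # my Santa Claus pace, min/mile
eq_pace = INTERCEPT + SLOPE * sc_pace     # predicted Equinox pace
finish_min = eq_pace * MARATHON_MILES     # predicted finish, minutes
hours, minutes = divmod(finish_min, 60)
```

This works out to a pace of about 11.34 minutes per mile (11:20) and a finish of roughly 4 hours 57 minutes, matching the numbers quoted above.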