Storm Surge

Thanks to the excellent data/interface provided by NOAA (click here for Montauk data), I was able to throw these plots together.
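If you want to throw a similar plot together yourself, here is a minimal sketch. It is not the exact procedure behind these figures: it assumes NOAA's CO-OPS water-level API plus Python's requests and matplotlib, and the station ID, date window, and parameter values are my own choices, so check them against NOAA's documentation before leaning on them.

```python
# Minimal sketch: pull water-level data from NOAA's CO-OPS API and plot it.
# Station ID, date range, and parameter values are assumptions, not values
# taken from this post; verify them against the Tides & Currents docs.
import requests
import matplotlib.pyplot as plt
from datetime import datetime

API = "https://api.tidesandcurrents.noaa.gov/api/prod/datagetter"

params = {
    "begin_date": "20121028",   # assumed window around the storm
    "end_date": "20121031",
    "station": "8510560",       # Montauk, NY (assumed station ID)
    "product": "water_level",   # six-minute water-level observations
    "datum": "MLLW",            # Mean Lower Low Water, as in the plot caption
    "units": "english",         # feet
    "time_zone": "lst_ldt",     # local time with daylight saving
    "format": "json",
}

resp = requests.get(API, params=params, timeout=30)
resp.raise_for_status()
records = [r for r in resp.json()["data"] if r["v"]]  # drop gaps in the record

times = [datetime.strptime(r["t"], "%Y-%m-%d %H:%M") for r in records]
levels = [float(r["v"]) for r in records]

plt.plot(times, levels)
plt.ylabel("Water level above MLLW (ft)")
plt.title("Montauk, NY water level during Hurricane Sandy")
plt.gcf().autofmt_xdate()
plt.show()
```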

I have family on Long Island. As I understand it, if Hurricane Sandy’s storm surge is more than ~6 feet above high tide, the house will flood. So, I’m interested in the current ocean level. It’s still early in the storm, but the plots don’t look good.

For realtime updates, click the NOAA link above.

Plots updated 10:45 AM PDT, 10/30/2012; this is the final update. The status of the house is unknown, but everyone’s fine. Note the small after-surge that happened today. Neat.

Measured sea level at The Battery, Montauk, Kings Point, and New Haven during Hurricane Sandy. Data are NOAA measurements. ‘MLLW’ is “Mean Lower Low Water,” the tidal datum the levels are referenced to.

Good luck to everyone on the East Coast!

More HN numbers

I’m curious whether the Hacker News post will lead to a long-term increase in traffic.

Edit: 10/30/2012. After the post about Hurricane Sandy, this experiment had to end; traffic from the new post started to matter. These numbers are total hits to all of measuredmass, not just the Google post, so the above is the final plot.

By October ~26 (that spike), traffic was coming in bursts from various countries, especially Germany. I’m not sure the >10 hits/day rate would have held for more than another few weeks. Time will tell.

Hacker News

140 countries in a day! 30,000 hits!

Whoa.

I didn’t expect that post to reach the top spot on HN, let alone stay there. It would’ve been neat just to ping around on the front page for a bit before falling into obscurity. I’m glad you’re curious about it too.

Now I’m curious about you : ).


Google Homepage Size Over Time

This afternoon, I looked at the source of the main Google page with my browser. It spans my whole monitor and takes 15 pushes of the scroll-wheel to see it all. Whither minimalism?

Off to the Wayback Machine. One must be careful to avoid including the Wayback Machine’s own JavaScript and comments when assessing file size; I think I got it right. File sizes were assessed with `ls -l` and include only the size of the ‘.html’ file, no images or other fanciness. Dates were sampled pseudo-regularly, aiming for about two points per year.
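The measurement can also be scripted. The sketch below is not the procedure used for the plot: it assumes the Wayback Machine’s `id_` URL modifier (which, as I understand it, returns the raw capture without the archive’s injected toolbar and JavaScript; verify a sample by eye) and counts the bytes of the returned HTML rather than running `ls -l` on saved files. The timestamps are illustrative, not the actual sample dates.

```python
# Sketch: measure the size of archived google.com HTML over time.
# The 'id_' modifier (assumed here) should serve the raw capture without
# the Wayback Machine's injected toolbar/JavaScript.
import urllib.request

# Pseudo-regular sampling, roughly two snapshots per year; timestamps are
# illustrative. The archive redirects each request to the nearest capture.
timestamps = ["20010401000000", "20011001000000", "20050401000000",
              "20051001000000", "20120401000000", "20121001000000"]

for ts in timestamps:
    url = f"https://web.archive.org/web/{ts}id_/http://www.google.com/"
    with urllib.request.urlopen(url) as resp:
        html = resp.read()
    print(f"{ts[:8]}: {len(html)} bytes")
```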

Six points corrected. Thanks, ‘qxcv’! Archive, curl, and browser data are now noted. Thanks, ‘zzzwat’!

Time for a log plot. The difference between a browser-acquired page and a simple curl request is an order of magnitude.