Friday, August 12, 2016

Beetle Larva Cocoon, Backups and Pi Failures

Noticed this Beetle Larva (not sure of the species, do you know?) forming a cocoon over the past few days. At this point it enters the Pupa stage, where it will only be able to wiggle! I'm not sure how long this will last, and since it's in the compost bin I have to decide whether to disturb it (by turning the compost) or leave it be and observe!

I'm curious: over a ~6 hour period there is a significant color change in the Larva/Pupa, and it's not clear to me what causes it. There's documentation of this in the Butterfly Larva > Pupa transition, and I'm guessing it helps the Pupa / Cocoon blend in better with its surroundings, but what process causes the change?

The Larva measures approximately 17mm x 2mm, and is buried about 20cm deep in the compost bin. Another reason using a scanner is nice: we can make quick/easy/accurate measurements based on pixels.
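For example, converting a pixel measurement back to millimetres only needs the scan DPI (the 402 px figure below is an illustrative count, roughly the larva's length at 600 DPI, not an actual measurement from the scan):

```shell
# Convert a pixel count to millimetres for a given scan DPI.
# mm = px * 25.4 / dpi  (25.4 mm per inch)
px_to_mm() {
    awk -v px="$1" -v dpi="$2" 'BEGIN { printf "%.1f", px * 25.4 / dpi }'
}

px_to_mm 402 600   # ≈ 17.0 mm
```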

Unrelated to the soil life, I've had two amusing technical issues over the past few days:
One: While I was tinkering with a script it looks like I deleted almost a day's worth of images. That's frustrating, but the timing is somewhat appropriate: I'd purchased a 1 TB drive a while back to store images on, but hadn't hooked it up in quite some time.

I think it would be appropriate to have the Pi save images to both the SD Card and the Hard Drive, and once a day tar+gzip the files off to Amazon S3. This would afford me the option to play with one set of images, knowing that if I did something horribly wrong I'd have a local backup.
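A minimal sketch of that daily job (the paths, bucket name, and choice of s3cmd as the upload tool are all assumptions, not the actual setup):

```shell
#!/bin/bash
# Daily backup sketch: archive the day's images, then push to S3.
# SRC, ARCHIVE, and the bucket name are hypothetical placeholders.
set -e
DAY=$(date +%F)
SRC="${SRC:-images/$DAY}"
ARCHIVE="soilcam_$DAY.tar.gz"

mkdir -p "$SRC"                      # ensure the directory exists for this sketch
tar -czf "$ARCHIVE" "$SRC"           # tar+gzip the day's images

# Upload if an S3 client is available; otherwise keep the local archive.
if command -v s3cmd >/dev/null 2>&1; then
    s3cmd put "$ARCHIVE" "s3://example-soilcam-bucket/"
else
    echo "s3cmd not installed; keeping local archive $ARCHIVE"
fi
```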

Two: Around 1:30am yesterday the Raspberry Pi Model 2 B I was using croaked. I'm not sure what the status is beyond it not getting on the network (hooked up via Ethernet), and an SD Card that works in another Pi gives the same result. I'm still without an HDMI monitor here, so everything is done headless, and I'm currently unable to see what might be happening in the boot process. Thankfully a second Pi was ready to take its place.

Having a $35 computer means it's easy to keep a backup Pi on hand. Thank you Raspberry Pi Foundation : )

Lessons here:
 - Have a process for swapping the Pi out when something goes wrong
   - Backup SD Card
   - Backup Pi
 - Have some notification that things are not as they should be
   - The Pi going off the network is not a huge deal, though it is a symptom that the Pi may be down
   - Not sure if there's a better way to alert me that there may be severe issues
 - Have a process for saving images to two separate locations
   - SD Card
   - External USB Drive
 - Have a process for saving scripts in the same way
 - Automate Amazon S3 sync of files
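For the network-based notification, even something this small, run from another machine via cron, would cover the basics (the hostname is a placeholder, and this whole approach is just a sketch):

```shell
# Crude liveness check for the Pi; the hostname is a placeholder.
PI_HOST="${PI_HOST:-raspberrypi.local}"
if ping -c 1 -W 2 "$PI_HOST" > /dev/null 2>&1; then
    status="up"
else
    status="down"    # a real alert (email, etc.) would go here
fi
echo "Pi is $status"
```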

Saturday, August 6, 2016

Fungus, Frame Rates and Glaciers

Radiolab has a wonderful podcast on trees, roots, and fungus. It's a delicious introduction that really puts into perspective how interconnected everything is. As in Trees eat Salmon. Don't believe me? Listen to the podcast or read this article.

The same day I listened to this I noticed a few bright white growths in the compost, neat! There's been lots of fungal growth in the compost pile, but I haven't specifically seen this one yet, at least not in these concentrations.

Fungus doesn't move as fast as the bugs we're seeing. I could probably capture one scan per hour and feel pretty good about capturing movement in fungus. But I'm capturing a scan every 15 minutes, and the video playback at this rate causes bug movement to look like a hastily recorded stop motion animation.

Frame Rates!
So I changed the scan to run every 2 minutes, and I've captured a day's worth of footage.

Prior: 1 scan / 15 minutes = 96 scans a day, played back at 30 Frames Per Second = 3 second video.
Now: 1 scan / 2 minutes = 720 scans a day, played back at 30 Frames Per Second = 24 second video.

While I like the smoother movement, I don't necessarily like how long the video lasts. Ooh, but most screens these days can play back at 60 FPS.

60FPS: 1 scan / 2 minutes = 720 scans a day, played back at 60 FPS = 12 second video. Looks even smoother, and plays back faster!
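The arithmetic above, as a tiny helper (a sketch, using integer division):

```shell
# Seconds of video per day of scanning, given the scan interval
# (in minutes) and the playback frame rate.
video_seconds() {
    local interval_min=$1 fps=$2
    local scans_per_day=$(( 24 * 60 / interval_min ))
    echo $(( scans_per_day / fps ))
}

video_seconds 15 30   # 96 scans/day  -> 3 seconds
video_seconds 2 30    # 720 scans/day -> 24 seconds
video_seconds 2 60    # 720 scans/day -> 12 seconds
```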

30 FPS
60 FPS
I had the option to either re-encode the video from the original images at 60 FPS, or convert the existing video to raw and re-encode it at 60 FPS. I had not done the latter before, so:

avconv -i sc_20160805.mp4 -f rawvideo -b 50000000 -pix_fmt yuvj420p -vcodec rawvideo -s 768x1080 -y temp.raw

avconv -f rawvideo -pix_fmt yuvj420p -s:v 768x1080 -r 60 -i temp.raw -c:v libx264 sc_20160805_60fps.mp4

Not sure what I'll do in the future, for now I'll just continue capturing and encoding videos at 30 FPS.

This is really all about storage. I'm capturing a 600 DPI image once every 2 minutes. It's saved as a JPG, which greatly reduces the storage space (with some image loss), but still costs 7MB / image.

7MB / image
720 images / day
5 GB / day of storage.

The Raspberry Pi has a 32GB SD Card on it, of which I can use ~28GB. That's only 5 days of storage. Not to mention the terabyte or so of storage I've already offloaded over the past couple years of doing this.

Amazon S3 is a pretty easily accessible storage method at $0.03 / gigabyte. Right now I have around 300 GB on it, which costs ~$9 / month. Given the increase in scanning frequency and resolution, I'll add about 150 GB / month (5 GB / day * 30 days).

That means I'll be paying an additional $4.50 / month (150 GB * $0.03 / GB), compounding every month. Not so bad, until a year goes by and that addition alone is $54 / month (12 * $4.50) and still increasing.
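The storage and cost figures above, as one quick calculation (the numbers in the post are rounded; these are the raw values):

```shell
# Back-of-envelope: daily volume, SD card lifetime, and added S3 cost.
awk 'BEGIN {
    gb_per_day = 7 * 720 / 1024                  # ~4.9 GB/day (rounded to 5 above)
    printf "GB per day:        %.1f\n", gb_per_day
    printf "days on 28 GB SD:  %.1f\n", 28 / gb_per_day
    printf "added GB/month:    %.0f\n", gb_per_day * 30
    printf "added USD/month:   %.2f\n", gb_per_day * 30 * 0.03
}'
```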

But I just realized there's this thing called Amazon Glacier, which is sloooooooooooow for retrieving data but only costs $0.007 / GB, and I can set up S3 to automatically move any files that haven't been used in X days into Glacier storage. This sounds quite ideal. I'd love to hear if there are better storage options. I do have a local drive I can store stuff on, but I trust Amazon a lot more than I trust my cheap drive.
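The lifecycle rule itself is a small bit of JSON; something along these lines (the bucket name and the 30-day threshold are placeholders, and this assumes the AWS CLI rather than any particular tool I'd actually use):

```shell
# Hypothetical: move objects untouched for 30 days into Glacier.
aws s3api put-bucket-lifecycle-configuration \
    --bucket example-soilcam-bucket \
    --lifecycle-configuration '{
        "Rules": [{
            "ID": "archive-old-scans",
            "Status": "Enabled",
            "Prefix": "",
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}]
        }]
    }'
```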

Yay for Glaciers!

Thursday, August 4, 2016

7 Day 4k Test, Broken Images, Jitter

A seven day 4k Test. I'm not sure how well this works as I've yet to actually view it at 4k resolution on a screen that supports it. If anyone out there does, let me know how it looks ;)

The workaround I came up with for removing broken images so far has been pretty reliable. Out of 74 days a total of 7104 images have been captured and 329 were removed. 7104-329 = 6775 images, but somewhere I'm missing 20.

Jitter: the Canon LiDE 20 scanner would not always start in exactly the same location. This wasn't very noticeable at lower resolutions, but when zooming in on a higher-resolution scan and flipping through a few images, the jitter became distracting.

Though still noticeable, it's not nearly as bad as it was.

Tuesday, August 2, 2016

July Compost Bin!

The full video of July's compost bin can be seen below. I don't think we've ever had any sort of schedule for turning the compost, but now that I can see it I've started turning when the bulk of it is a delicious brown color...  I should read up on composting...

Mushroom caps seem to pop up regularly a week or two after the turning, but I turn the compost too quickly to see what might be feeding upon the mushrooms/hyphae. Fungus being one of the most important decomposers in our world, I'm curious to see what feeds upon it. It took me a moment to remember that we (humans) are one of those creatures, though I won't be capturing that image with a buried scanner any time soon. Apparently some forms of fungus feast upon other fungus, another example of just how amazing fungus is.

I'm still scanning at 600 DPI which gets me super excited to see what new things can be seen that weren't before.  I'm a little less excited by the data storage requirements...  If all goes well in a few days I should have a week of scrolling footage at 4k : ) Anyhow, last month's video at 768 horizontal pixels:

Thursday, July 28, 2016

More Compost and a brief 4K Video Test

19 Days of Compost featuring Tomatoes, Carrot Tops, Corn Husks and Mushrooms. It's interesting to see so many mushroom caps growing up within the compost. For some reason I had assumed these would only be featured above in the sunlight. Mushrooms don't photosynthesize, so that assumption was a poor one....

The scanner has been running this entire time at 300 DPI, which gives me an image width of approximately 2550 pixels (8.5" * 300 pixels per inch). The 4K UHD standard is 3840 pixels wide, and I'd love to see a video like the one above done at 4k. So I've upped the scanning resolution to 600 DPI, and will capture that for hopefully a few weeks. A 3 second test of that can be seen here:

Unfortunately, I don't have a display capable of 4k. Hopefully this is set up properly!
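The width arithmetic, for the record (8.5 inches of platen at each resolution; 4K UHD is 3840 pixels wide):

```shell
# Scan width in pixels = inches * DPI (8.5 in written as 85/10).
echo "300 DPI: $(( 85 * 300 / 10 )) px"   # 2550
echo "600 DPI: $(( 85 * 600 / 10 )) px"   # 5100
```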

Tuesday, June 21, 2016

30 Days of Compost + Panning Video + Removing 'bad' images

1. This is 2743 images taken over ~30 days.

2. A scan happens every 15 minutes, that means I should have ~2,880 images.

3. Sometimes the scanner jams halfway through a scan. This isn't horrible, but the resulting image has a streak of repeating pixels starting where the scanner jammed. This is particularly jarring when images are played back as a video.

4. I liked the effect of panning across the images (see earlier post), but wanted to clean things up + timestamp the images.
So two short scripts are run:
Clean Images:
for file in *.jpg; do
        convert $file -crop 768x1+0+10 image_a.jpg
        convert $file -crop 768x1+0+30 image_b.jpg
        if [ $(compare -metric RMSE image_a.jpg image_b.jpg NULL: 2>&1 | sed 's/.*(//' | cut -c1-4 | cut -c 4-) -gt 0 ]; then
                echo "$file keep"
        else
                echo "$file remove"
                mv $file brokenimages/
        fi
done
rm image*.jpg

Left image bad. Jammed about 3/5ths through the scan. Right image good!

For every jpg image in a directory we'll create two temporary images, each a single row of 768 pixels, located 10 and 30 pixels down from the top respectively. We'll use ImageMagick's compare tool to get a rough idea of how different they are; the feedback looks like:

350.361 (0.00534616) for images with little to no difference

2145.33 (0.0327357) for images with some difference

I'm using Bash, which doesn't like floating point numbers, so we get lazy and grab the second digit past the decimal point. If this digit is > 0, we can assume the two rows differ enough that the image is good to keep. This could fail, but so far it's worked across ~3,000+ images. It's not particularly fast, taking approximately 3 seconds / image on a Pi 2 Model B. Running it retroactively isn't fun, but checking images as they are scanned (once every 15 minutes) is plenty fast.
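If I ever wanted to be less lazy, awk could do the floating point comparison properly. A sketch, using the two sample RMSE values above and a made-up 0.01 cutoff:

```shell
# Let awk compare the normalized RMSE as a real float.
# The 0.01 threshold is a guess, not a tuned value.
is_different() {
    awk -v r="$1" 'BEGIN { exit !(r > 0.01) }'
}

is_different 0.00534616 && echo keep || echo remove   # remove (near-identical rows)
is_different 0.0327357 && echo keep || echo remove    # keep
```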

In order to get the panning / scrolling effect, I crop each image down to 1920x1080 pixels starting at the top. With (almost) every image I shift the cropped area down 1 pixel. I like the idea that I can slowly pan through the entire length of the scanned area in the time it takes to run through each image, but I have more images than pixel height to move through:

2743 images = 2743 pixels to shift.
3484 = original image height.
-1080 to account for the video height
2404 = 3484-1080. Total number of pixels I need to shift down.

If I move down one pixel for each image, we'll end up 339 pixels below the bottom of the scanner. I could try shifting partial pixels, but that involves more work. Again laziness: instead we just don't shift every 6th image. This isn't exact, but it gets us close to the bottom of the scanner area.
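The numbers, checked quickly:

```shell
# Sanity check on the panning arithmetic.
images=2743
height=3484
video=1080
echo "pixels to shift:       $(( height - video ))"             # 2404
echo "overshoot at 1 px/img: $(( images - (height - video) ))"  # 339
echo "shifts, skipping every 6th image: $(( images * 5 / 6 ))"  # 2285, close enough
```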

Is it noticeable in the video? : )

Cycling through all the images in a directory, we grab the date and time from the file name and set them aside as ndate & ntime. Using the ImageMagick tool convert, we crop a 1920x1080 chunk out of each 2480x3484 image starting at the top, shifted 280 pixels in from the left. We add a small 400x30px background, bottom center, and put the $ndate and $ntime values as a text caption on top of this background.

For each image processed we increment the value $a by 1; while $a is < 6, we shift the cropped location down 1 px. When $a equals 6, we don't shift the image down and we reset $a to 0.

This image is saved with the prefix "temp_" and a four digit number starting with 0000. Saving in this naming format makes it easy for us in the next step to wrap these images up in a nice mpeg video file.

count=0
a=0
shift=0
for file in *.jpg; do
        # ndate and ntime are parsed from the file name (format-specific, omitted here)
        counter=$(printf %04d $count)
        convert $file -crop 1920x1080+280+$shift - | convert -background '#0008'   \
                -gravity center \
                -fill white     \
                -size 400x30    \
                -pointsize 24   \
                -kerning 2.5    \
                -font Courier   \
                caption:"${ndate} ${ntime}"     \
                -       \
                +swap   \
                -gravity south  \
                -composite      \
                temp_$counter.jpg

        if [ $a -eq 6 ]; then
                let shift=$shift+0
                let a=0
        else
                let shift=$shift+1
                let a=$a+1
        fi

        let count=$count+1
        echo -e "$file\t$count\t$shift"
done

And then we need to turn these images into a video file:
avconv -y -r 30 -i temp_%04d.jpg -r 30 -vcodec libx264 -crf 20 -g 15 CompostCam30Days.mp4

This probably took an hour to process on a ~6 year old Mac laptop. It could have been done on the Raspberry Pi, but the converting and timestamping of images would have taken longer.

Nothing too crazy here, all the hard work has already been done by ImageMagick and avconv.  But it's fun, and I should probably turn the compost : )

Friday, June 3, 2016

Guide Posted and More Compost Cam!

Posted a guide on how to make a SoilCam / Rhizotron! Lots of things I would love to add / edit / modify, but I think it's at a point where not posting it now would mean it never would : )

Apparently ffmpeg and avconv had a security hole in their concat command. This confused me for longer than I'd like to admit before I realized they had dropped the command in a newer release. Thankfully with mpeg.ts files you can simply cat them together and then convert.
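The concatenation really is that simple; a toy demonstration with stand-in files (the re-wrap line is commented out since it only makes sense with real streams):

```shell
# MPEG-TS segments concatenate byte-for-byte; demo with dummy files.
printf 'one' > seg1.ts
printf 'two' > seg2.ts
cat seg1.ts seg2.ts > combined.ts
cat combined.ts    # -> onetwo
# Then re-wrap without re-encoding:
# avconv -i combined.ts -c copy combined.mp4
```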

The Compost Cam continues to run and I've captured about two weeks of data. The broken images make it a bit jarring; cleaning those out is pretty easy, but even with them I still find it fascinating to watch what's happening with my food!

The jarring is a result of the scanner jamming, presumably due to too much pressure from the soil pressing in on it. Al has recommended a few modifications (enclosing it in a half cylinder). I wonder if it's pressure on the edge, or on the back where a plastic linear gear rail resides, that causes this?

The scanner is mounted so the scan travels down; perhaps that orientation allows the gears to slip?

Either way, figuring out how to make the scanner last longer and skip less would be the ideal next step for this whole project....