A while ago I updated my site to
Eleventy.js. This improved
my blogging process considerably, as Markdown is quick to write,
and code samples in particular are easy to copy-paste without any escaping.
I put all my posts into one Github repository. Using Visual Studio
Code, I get a nice real-time preview, and once I'm
done, I just git commit and git push on my computer, then git pull on the
server, and regenerate the site with Eleventy.
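Spelled out, the round trip looks something like this (the directory names are the same ones used in the script further below, and the post filename is just a placeholder):

# On my computer: commit and push the new post
git commit -am "Add new post"
git push

# On the server, over SSH: pull and regenerate
cd ~/myblog_post && git pull
cd ~/myblog_eleventy && npx @11ty/eleventy --input=. --output=../apps/myblog_static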
Now the only issue is that one usually finds 5-10 things to change after every
post, and while VS Code has great git support built in, even this simple update
process gets a bit tedious with the commands being run over SSH. So I started
wondering if I could use the Github
webhooks
to automate the regeneration. Turns out: YES.
Simple Script to Pull Markdown from Github and Regenerate
The first component is of course a small shell script to automate the git pull and
regeneration. One could write a more elaborate one, but this worked for me server-side:
#!/bin/sh
cd ~/myblog_post # here are the .md files
git pull
cd ~/myblog_eleventy # here's the 11ty site generator
npx @11ty/eleventy --input=. --output=../apps/myblog_static # hosting dir
Node.js Mini-Server for Github Webhooks
Next, I needed to set up a web server that would receive the HTTP POST from
Github whenever I push changes. Here your configuration will depend on the
hosting you have, but Opalstack for example
offers a simple installation of a Node.js application. I usually disable the
automatic restarting (crontab -e etc.), use the ./stop script, and run my server manually for a while to see that everything works, before restoring the crontab.
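That testing cycle looks roughly like this (the app directory and script names here are placeholders, adjust for your own host):

crontab -e             # temporarily comment out the line that restarts the app
cd ~/apps/myblog_hook  # placeholder for the actual app directory
./stop                 # stop the supervised instance
node server.js         # run in the foreground to watch the logs
# ...and once everything works, start the app again and restore the crontab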
If you choose to forego the Github webhook security mechanisms, the code is really
simple, but in that case anyone who knows the address of your server can flood
you with fake push requests. So let's take the high road and use this gist to
verify the Github hooks! I chose to use Polka, so I needed to modify
the headers part of the code just a bit:
const { exec } = require('child_process');
const polka = require('polka');
const { json } = require('body-parser');
const crypto = require('crypto');

const port = 12345;
const secret = 'ohreally :)';
const sigHeaderName = 'x-hub-signature-256';
const sigHashAlg = 'sha256';

// Middleware to verify Github "signed" POST request
function verifyPostData(req, res, next) {
  console.log('Verifying signature', req.headers[sigHeaderName]);
  if (!req.rawBody) {
    console.log('Request body empty');
    return next('Request body empty');
  }
  const sig = Buffer.from(req.headers[sigHeaderName] || '', 'utf8');
  const hmac = crypto.createHmac(sigHashAlg, secret);
  const digest = Buffer.from(
    sigHashAlg + '=' + hmac.update(req.rawBody).digest('hex'), 'utf8');
  if (sig.length !== digest.length || !crypto.timingSafeEqual(digest, sig)) {
    console.log('Got request with invalid body digest');
    return next(`Request body digest (${digest}) did not match ${sigHeaderName} (${sig})`);
  }
  console.log('Verification done.');
  return next();
}

polka()
  .use(json({
    verify: (req, res, buf, encoding) => {
      // Store the raw body in req.rawBody for signature verification
      if (buf && buf.length) req.rawBody = buf.toString(encoding || 'utf8');
    }
  }))
  .get('/', (req, res) => { res.end('Hello, polka!'); }) // just for testing
  .post('/myblog', verifyPostData, (req, res) => {
    console.log('Article repo updated, generating site...');
    exec('~/myblog_eleventy/gen.sh', (error, stdout, stderr) => {
      if (error) console.log(`error: ${error.message}`);
      if (stderr) console.log(`stderr: ${stderr}`);
      console.log(`stdout: ${stdout}`);
    });
    res.end('Hello, Github!');
  })
  .listen(port, () => {
    console.log(`> Running on localhost:${port}`);
  });
At first I was puzzled that the Github signature header arrived in
req.headers['x-hub-signature-256'] instead of the capitalized
X-Hub-Signature-256. This is actually expected: Node.js lowercases all incoming header names, which is why sigHeaderName above is lowercase.
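By the way, you can test the verification locally without pushing a commit by signing a test payload with the same shared secret, for example like this (note that a valid signature will actually run gen.sh):

PAYLOAD='{"zen":"test"}'
SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha256 -hmac 'ohreally :)' | sed 's/^.* //')
curl -H "Content-Type: application/json" \
     -H "X-Hub-Signature-256: sha256=$SIG" \
     -d "$PAYLOAD" http://localhost:12345/myblog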
I’ve been occasionally doing backups of critical files to an external hard drive (in addition to the cloud, of course :). However, my nice portable drive was only 500 GB, and lately I’ve pushed past 600 GB with my Nikon D7200 RAW files. Time for a new drive! Instead of a boring mechanical one, I noticed that the very nice Adata XPG SX8200 Pro NVMe SSD with 1 TB capacity was available nearby for just 140€ (ca. $150)!
Commercial alternatives like the Samsung T5 cost around 230€ here, so I thought I’d get one of those M.2 enclosures instead. Unfortunately, the ones with NVMe support started from 50€ in Finnish web stores.
Ugreen to the rescue
With something like an M.2 enclosure, you know every manufacturer actually puts the same Chinese electronics inside. Thus, AliExpress seemed like an obvious destination to check out. While I’m a bit doubtful about ordering an actual NVMe drive there (there were some cheap flash drives in the past that did not actually have the reported capacity), an enclosure should be fine.
Enter Ugreen, my favorite AliExpress store. I’ve purchased several chargers from them, many with QuickCharge functionality, and the packaging, quality and everything else have always been top notch. Therefore I was more than happy to find a range of NVMe enclosures from them for just $15-30:
Ugreen M.2 SSD USB enclosures (from Ugreen product page)
Time to order one! Fast forward 2½ weeks of anxious wait…
Unpacking and installing the SSD into the M.2 enclosure
I got the NVMe model, which promised up to 10 Gbit/s data rates, and chose the option with an extra USB cable as I don’t have USB-C ports on my motherboard. The package arrived a bit faster than the promised 21-25 days. See the gallery below for glorious images of the various stages of setup.
Just a quick update this time: a long while ago I made a post about using Adafruit Trinket without Arduino and later converted that into the TrinketMIDI Github repository for making a MIDI device with an ATtiny:
Thanks to a contribution by Gerhard Zintel, there is now also MIDI volume device sample code in the repo. If you want to make a MIDI volume controller, it should be pretty easy to build on that code. Enjoy!
Note: The keyboard and keycaps in this article were bought by me and are not review samples. I have, however, worked with KeyboardCo in the past and like them a lot in general. But just so you know!
A new keyboard in the house! Namely the Filco Majestouch-2 TK (MX Blue). Always an exciting happening in the family. After typing happily for a couple of years on the superbly compact and slim Apple Magic Keyboard (works fine with Windows, btw.) at home, and with my Topre Realforce 88UB at work, I thought it would be fun to get a keyboard with the classic clicky MX Blues.
My main reason to get Cherries, apart from the amazing blue clicky sound, is the fact that one can get a wide selection of custom keycaps, very much unlike with the Topres, where you’re pretty much stuck with the keys they came with, or maybe some with Japanese characters.
After some consultation on Geekhack, I decided that out of the options I had available (in Finland, pretty much zero apart from some gaming keyboards), a Filco would be a good choice. Knowing they stock it, I headed straight to The Keyboard Company website and after some deliberation opted for one in the Scandinavian layout, which is easier to swap here in Finland if I want to switch again. The Filcos are in no way inexpensive, but knowing the amount of time I spend typing, I considered the hourly cost to be quite reasonable.
Unboxing Filco Majestouch
The delivery from KeyboardCo arrived promptly as always, and I decided to shoot a classic unboxing video. Notice the great “Code and Life” logo in the thumbnail! There are no audio comments in the video, but you can hear the clickies quite well.
As an “out of the box” experience, here is my list of major plusses and minuses:
Plusses
Very solid build, the case will definitely last a lifetime
Great MX blue typing experience and satisfying sound
Compact layout, it doesn’t extend much beyond the keys in any direction
It’s a “no frills” workhorse, not much more to be said!
Minuses
Standard keycaps are quite high, making a wrist support pretty much a must
There’s nothing particularly exciting or special about the look
I started writing another blog post about my new keyboard today, and when uploading the unboxing video to my YouTube channel, I realized they have no “auto white balance” option, neither in their new nor their old video editor. Shoot. After googling for free video editors, I settled on OpenShot.
And guess what? OpenShot doesn’t have a white balance setting either! The author himself said as much on Reddit, asking for help to implement it. I was pretty shocked, as it seems like the first filter I would implement myself, and thought “surely it doesn’t take more than five minutes to implement one, right?”. So I did. Well, it took maybe 15 minutes, plus 45 of fiddling with Jupyter notebooks to get the PIL and numpy commands right.
The Python 3 code in the gist basically loads an image (either a local one if you run the notebook locally, or over the network), gets a small subportion of it to act as a grey reference, and adjusts the color channel balance with two alternative methods: RGB or YCbCr. More advanced versions should be easy to add as well.
You can view the gist on Github or just copy-paste the code into your own / cloud-based notebook to try it out.
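If the gist embed doesn't load for you, the essence of both methods is roughly as follows. This is a minimal sketch rather than the actual notebook code, and the filename and grey patch coordinates are placeholders:

from PIL import Image
import numpy as np

# Load the image and pick a region that should be neutral grey
img = np.asarray(Image.open('photo.jpg').convert('RGB'), dtype=np.float32)
patch = img[100:150, 200:250]  # placeholder grey reference coordinates

# RGB method: scale each channel so the patch average becomes neutral
means = patch.reshape(-1, 3).mean(axis=0)
rgb = np.clip(img * (means.mean() / means), 0, 255).astype(np.uint8)
Image.fromarray(rgb).save('photo_rgb.jpg')

# YCbCr method: shift the chroma channels so the patch average lands at 128
ycc = np.asarray(Image.open('photo.jpg').convert('YCbCr'), dtype=np.float32)
ycc[..., 1] -= ycc[100:150, 200:250, 1].mean() - 128.0
ycc[..., 2] -= ycc[100:150, 200:250, 2].mean() - 128.0
out = Image.fromarray(np.clip(ycc, 0, 255).astype(np.uint8), mode='YCbCr')
out.convert('RGB').save('photo_ycbcr.jpg')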
Two weeks ago I had the chance to visit the official Raspberry Pi store in Cambridge. Apart from those living in the UK, I think not many will make it that far, so I thought I’d share my pictures from the visit (and maybe help you evaluate whether it’s worth the trip). Enjoy!
The Store
The Raspberry Pi Store is located in the Grand Arcade shopping mall, on the second floor. It looks nice and official.
Naturally, it houses an excellent selection of Pi boards. There were the 3B+, 3A+, Zero, Zero W, Zero WH, compute modules, and of course the Pi 4 (with different memory options), all with good availability. The cheapest Zero boards were limited to 1 per customer, much like in the web store. There was a good choice of cases for all the boards on sale as well. Very nice.