PROJET AUTOBLOG


Chromic

Original site: Chromic


Blog Posts

Gitea, Drone CI, Hugo and Auto-deployment

Wednesday, July 31, 2019 at 02:00

This is an update to my previous blog post "Git, Gogs, Jekyll and Auto-deployment" since I've changed things up in the last four years.

The Static Blog Generator

I've switched to Hugo from Jekyll for a few reasons.

Publishing

I still just push to a git repository to trigger an update to the blog. I've switched my self-hosted git platform from Gogs to Gitea, however. The main reasons are that development on Gogs stalled for a while, and the Gitea community had some interesting ideas on its roadmap that I wanted to play with.

The Hook

Instead of calling a local custom script on the post-receive git hook, I'm using Drone CI to run `hugo` and `scp` the results over to the live blog. You can look at the .drone.yml file in the repository for more info.
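As a rough illustration of that kind of pipeline (the image names, secrets, and paths here are placeholders, not the contents of the actual .drone.yml), a Drone 1.x build-and-scp setup might look like:

```yaml
kind: pipeline
type: docker
name: blog

steps:
  # Build the site with Hugo
  - name: build
    image: alpine:latest
    commands:
      - apk add --no-cache hugo
      - hugo --minify

  # Copy the generated "public/" directory to the web server
  - name: deploy
    image: appleboy/drone-scp
    settings:
      host: blog.example.com
      username: deploy
      key:
        from_secret: ssh_key
      source: public/*
      target: /var/www/blog
```

Drone then runs this on every push to the repository, taking over the job of the old post-receive hook.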

Since I'm switching everything to LXC containers, the blog and Gitea are in separate "environments" so I can't rely on the old post-receive git hook on Gitea to build the blog on-the-fly. Plus, this task is perfect for a CI pipeline and I've wanted to set something like this up for a while now. I'll probably end up using the pipeline for other things too.

Weathered Radio

Sunday, June 23, 2019 at 02:00

A while ago, I wrote about listening in to our wireless water meter with a USB SDR. A while after that, we received a weather station that includes a wireless sensor to measure the weather outside:

Acurite weather station and wireless sensor

With the excellent rtl_433 utility, it was quite easy to get a reading off the wireless sensor. The following script launches `rtl_433` for 30 seconds and outputs the readings in JSON format. We then use `jq` to select only the readings from our sensor by filtering on sensor id (there seem to be a couple of other wireless sensors around), keep only the last reading, apply a small calibration with `jq`, and write the final JSON to a file. I then point nginx at that file, serve it with an `application/json` Content-Type and boom: local weather accessible remotely. Why? No idea, but it's kind of neat.

#!/bin/bash

# output dir
dir="/tmp"


# Run rtl_433 for 30 seconds (the sensor returns data every 16s, but whatever)
/usr/bin/rtl_433 -R 40 -F json -T 30 | \
# Filter out everything we received except data from our sensor (id=2168)
jq --unbuffered -c 'select(.id == 2168)' | \
# Only keep a single reading
tail -n 1 | \
# We subtract "3" from the temperature reading since the sensor seems to
# consistently be around 3 degrees too high
jq --unbuffered '.temperature_C = .temperature_C - 3' > "${dir}"/weather.new.json

# Overwrite the old file with the new data
mv "${dir}"/weather.new.json "${dir}"/weather.json
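The nginx side can be as small as a single location block (a sketch; the URL and the exact setup are assumptions, only the `/tmp/weather.json` path and the Content-Type come from the description above):

```nginx
# Serve the latest sensor reading as JSON
location = /weather.json {
    alias /tmp/weather.json;
    default_type application/json;
}
```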

I use the following systemd timer/service to have the script above run every 15 minutes:

weather.timer
[Unit]
Description=Update weather info every 15mins

[Timer]
OnBootSec=15min
OnUnitActiveSec=15min

[Install]
WantedBy=timers.target

weather.service
[Unit]
Description=Update weather info

[Service]
Type=oneshot
ExecStart=/home/chimo/scripts/weather.sh

Mail

Wednesday, June 5, 2019 at 06:00

Once again, I'm just documenting part of my setup. It's nothing that hasn't been done before (it's just a collection of different utilities, barely deviating from the out-of-the-box configuration, really). This time, we're talking about mail.

MUA

I do like me some command line, so my mail client is `neomutt`. I'll be honest, I don't know exactly what the differences are compared to mutt, but it's been working well so I'm not complaining.

I'm pretty excited about `aerc` though. It has some interesting features and ideas, such as an embedded terminal and highlighting patches with diffs. Of course, the first thing I did when I installed it (after testing the standard read/send mail features) is post a notice to my GNU social instance by calling `identicurse` from within `aerc`.

Receiving mail

I'm using `offlineimap` to fetch mail and save it locally; neomutt then reads the mail from there. Having the content locally lets me run other utilities on the data without them having to be network/IMAP/etc.-aware. Some examples are searching (more on that later) and mail notifications via mako/swaybar/swayblocks.
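A minimal `~/.offlineimaprc` for this kind of IMAP-to-Maildir setup might look like the following (account names, host, and paths are placeholders, not my actual configuration):

```ini
[general]
accounts = main

[Account main]
localrepository = main-local
remoterepository = main-remote

[Repository main-local]
# Local Maildir that neomutt reads directly
type = Maildir
localfolders = ~/mail

[Repository main-remote]
type = IMAP
remotehost = mail.example.com
remoteuser = chimo
remotepassfile = ~/.config/offlineimap/pass
sslcacertfile = /etc/ssl/certs/ca-certificates.crt
```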

Address book sync

I'm using `vdirsyncer` to sync my address book (and calendar, though I'm not using anything to display calendar data yet) from my Nextcloud instance.
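A sketch of a vdirsyncer config for syncing contacts from a Nextcloud CardDAV endpoint (the URL, paths, and password command are placeholders):

```ini
[pair contacts]
a = "contacts_local"
b = "contacts_remote"
collections = ["from b"]

[storage contacts_local]
# Local .vcf files that khard can read
type = "filesystem"
path = "~/.contacts/"
fileext = ".vcf"

[storage contacts_remote]
type = "carddav"
url = "https://cloud.example.com/remote.php/dav/"
username = "chimo"
password.fetch = ["command", "pass", "show", "nextcloud"]
```

After a one-time `vdirsyncer discover contacts`, a periodic `vdirsyncer sync` keeps the two sides in step.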

Address book integration with (neo)mutt

I'm using `khard` as a console CardDAV client and use the following settings to integrate it with (neo)mutt.
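The integration boils down to a `query_command` plus completion bindings in muttrc, along these lines (a sketch of the settings khard's documentation suggests, not necessarily my exact ones):

```
# Use khard to answer address queries
set query_command = "khard email --parsable %s"

# Complete addresses from the address book while composing
bind editor <Tab> complete-query
bind editor ^T complete
```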

Searching mail

I'm using `notmuch` to search mail within (neo)mutt by using the following key-bindings.
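With neomutt's built-in notmuch support, the key pieces are a notmuch URL plus a binding for ad-hoc queries; the mail path here is a placeholder and variable names differ between neomutt versions, so treat this as a sketch rather than my exact bindings:

```
# Point neomutt at the notmuch database
set nm_default_uri = "notmuch:///home/chimo/mail"

# A virtual mailbox backed by a notmuch query
virtual-mailboxes "All mail" "notmuch://?query=*"

# Prompt for a notmuch query and open the results as a virtual folder
macro index \\ "<vfolder-from-query>" "search mail with notmuch"
```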