Original site: Chromic


Aqueous Radio

Friday, November 17, 2017 at 06:00
Automated Meter Reading (AMR) module

"I wonder what that thing is…"
Is what I thought when I saw the "thing" in the picture above stuck to our house when we moved in almost four years ago. I then promptly forgot about it as I got busy with other things.

Last summer however, I was walking around the expanding neighborhood near construction sites. I must've been in a particularly inquisitive mood and my eyes happened to focus on one of those "things" on another house. I decided that the mystery needed to be solved so I walked back home and did a bit of research (fancy way of saying I ran two or three web searches).

As most people already know, that thing is an Automated Meter Reading (AMR) transceiver device for utility meters. This specific one is from the 100 Series by Itron.

Cool Story, Bro

Mystery solved, right? Nope! Now I need to know how it works. And since it's wireless technology, now I need to know if there's a way for me to tap into it and see what kind of information I can get from it.

So I put on my robe and researcher hat, and hit up the good old search engine again. Not long after, I find that somebody wrote a software defined radio (SDR) receiver for Itron compatible smart meters. Nice!

I do a bit more research and end up getting this USB receiver thingy (RTL2832 + R820T) by NooElec.

Things are Happening

With the above software/hardware combo I was able to receive data on the console, which is great, but I wanted a way to get an overview over time.

Around that same time, I had recently set up an "influxdb + telegraf + grafana" stack to gather SNMP data from my Ubiquiti devices (the instructions on how to do this are here in case anyone's interested). So I thought using that same stack for the water meter data, somehow, would make the most sense.

At some point I stumbled upon this gist that does exactly that. A few tweaks here & there and I have water consumption data, per day, in graph form!
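As a rough sketch of the last hop in that pipeline: each reading ends up as a point in InfluxDB, which accepts line protocol over its 1.x HTTP API. The measurement, tag, and values below are placeholders for illustration, not the names the gist actually uses:

```shell
# Format one meter reading as InfluxDB line protocol.
# Measurement/tag/field names here are made up for illustration.
METER_ID="12345678"
CONSUMPTION="4071"
LINE="water_usage,meter_id=${METER_ID} consumption=${CONSUMPTION}"
echo "$LINE"
# It can then be written to an InfluxDB 1.x instance with something like:
#   curl -XPOST "http://localhost:8086/write?db=utilities" --data-binary "$LINE"
```

With points in that shape, Grafana can graph per-day consumption with a simple query over the measurement.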

Running youtube-dl on Android via Termux

Tuesday, May 16, 2017 at 06:00

I recently got youtube-dl working on LineageOS nightly via Termux. Here are the steps I took:

  1. In Termux, install some of the tools we’ll need:
    pkg install python wget vim
  2. Create a “bin” folder in the home directory: mkdir bin
  3. Add the directory to your path at login: vim .bashrc
    export PATH=$PATH:~/bin
  4. Exit and re-launch Termux to update your path: exit
  5. Download youtube-dl: wget -O bin/youtube-dl
  6. Make it executable: chmod u+x bin/youtube-dl
  7. Change the interpreter path in the youtube-dl file: vim -b bin/youtube-dl
    #!/data/data/com.termux/files/usr/bin/env python
  8. Try it! youtube-dl [URL]
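Step 7 can also be done non-interactively. The snippet below demonstrates the same shebang rewrite with sed on a stand-in file (the real target would be bin/youtube-dl):

```shell
# Create a stand-in script with a stock shebang, then rewrite line 1 so
# "python" is resolved from Termux's prefix rather than /usr/bin.
printf '#!/usr/bin/env python\nprint("hello")\n' > youtube-dl-demo
sed -i '1s|.*|#!/data/data/com.termux/files/usr/bin/env python|' youtube-dl-demo
head -1 youtube-dl-demo
```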


Tuesday, April 4, 2017 at 06:00

I've transferred the domain from DreamHost to Hover, mostly for UI/UX reasons.

Back in 2009, when I was looking to get a domain, I chose DreamHost for Shared Hosting. They’re also a registrar, so that’s who I registered the domain with.

A couple of years later, I moved the hosting part to Linode and kept DreamHost as the registrar. Linode doesn’t do domain registrations, and I never had any problems with my domain over at DreamHost so I wasn’t really looking for alternative registrars.

Recently, I started looking into the possibility of implementing DNSSEC, and while DreamHost supports it (as long as you don’t use their nameservers), I couldn’t find the options (DNS glue, DNSSEC fields) myself in the account settings. While hunting around for those, I realized that the vast majority of what I was seeing on the screen was menus/options completely useless to me, since I’m not hosting anything with them.

Then I came across an article suggesting I'd need to create a support ticket to have support complete the configuration. Blech. What is this? 1997? I never had to deal with DreamHost support much, and the couple of times I contacted them they were quick to respond and helpful, but I’d very much rather have the option to do this on my own.

I did a couple of searches to see what my options were and decided to move to Hover. They don’t do hosting, their UI focuses on domain-related features, and the glue/DNSSEC settings aren’t hidden away multiple levels deep or completely missing.

More importantly, I can set those up myself. Win!

The transfer itself went without a hitch. No surprises, no downtime, and ultimately transparent. So kudos to both DreamHost and Hover for making that happen painlessly.


Saturday, March 4, 2017 at 06:00
I just renewed the domain for another two years:
Boop! Renewed. You guys are stuck with me for at least another two years.

The act of renewing the domain kind of felt like the beginning of a new year, somehow. A good time to reflect on what I've done with the domain since the last time I renewed it, to think about where it's going and what the future projects are.

I'm not going to lay down plans for the next two years in this post because frankly, I haven't thought about it that far ahead. But I can try to give an overview of the plans I have for the near-future. Maybe publishing those plans here is going to give me a reason to get to them faster than I normally would.

Serving DNS

Right now, I’m hosting a bunch of stuff (including this blog) on Linode, and I’m using Linode’s nameservers for DNS. Everything’s been working well, but I want to try running my own DNS server for my (sub)domains. There are a couple of reasons for that. One is “Project Autonomous” driven: don’t rely on third parties whenever possible. The other is that Linode’s nameservers don’t support DNSSEC, which brings me to the second item in my plans.


DNSSEC

I want to implement DNSSEC. I’m not sure what else to say about this other than “because why not!”. I think it’ll be interesting and I might learn a couple of things along the way.


DANE

I also want to implement DANE, for the same reasons listed for DNSSEC, really.


Indieweb

The * “environment” has grown and evolved and changed over the years. When I started fiddling around with all the platforms and things on here, I wasn’t aware of the Indieweb’s POSSE, webmentions, etc. I kind of wedged webmentions on here at some point, but I’d like to make Indieweb a first-class “citizen” of this place.

For example, I can receive webmentions, but it’s a bit of a pain to reply or send them at the moment. I do have plans and tools in mind to fix this problem, and I hope to get to it soon.


HSTS

[ update: This is done!]

I don’t know why, but I haven’t enabled HSTS on my (sub)domains yet. I want to do that.

Content Security Policy Headers

I want to enable restrictive CSP headers on all subdomains.
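In nginx, that's a one-line addition per server block. A minimal sketch (the policy value below is an assumed baseline; the real one depends on what each subdomain actually loads):

```nginx
# Strict starting point: only same-origin resources allowed.
# Loosen per-directive (script-src, img-src, ...) as each site requires.
add_header Content-Security-Policy "default-src 'self'" always;
```

The `always` parameter makes nginx send the header on error responses too, not just 2xx/3xx.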

Enable Brotli Compression via nginx

[ update: This is done!]

I want to enable Brotli compression on my nginx server. I need to grab the module and compile nginx against it. Not a big deal. I might use the Arch Build System (ABS) for that. I don’t know yet.
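Once the module is compiled in, the nginx side is just a few directives. A sketch using the directive names from Google's ngx_brotli module (the compression level and MIME list are assumptions, to be tuned):

```nginx
# Serve brotli-compressed responses to clients that advertise support.
brotli on;
brotli_comp_level 6;
# text/html is compressed by default; other types must be listed explicitly.
brotli_types text/css text/plain application/javascript application/json image/svg+xml;
```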

Home network (unrelated, bonus item)

This has nothing to do with the domain renewal or the Linode VPS, but since I’m making a list of things I want to do I’m adding it here.

I recently got a couple of network devices that allow me greater control over my network (compared to the generic ISP-provided router/modem I was using before) and I need to sit down and configure my home network with the ideas I have in mind.

Once that’s done, I’m going to have a blog post dedicated to that configuration on here.

Lifestream Architecture

Monday, June 15, 2015 at 06:00

Recently, the homepage of this blog changed from a static list of recent blog posts to a realtime stream of updates (the list of blog posts is still there, moved to the sidebar). This post tries to document (mostly as a "note-to-self", again) how this "lifestream" works.

Note: This is a work-in-progress; I've already got plans to tweak how some of the pieces fit together.

An overview of the pieces used to build the update stream. Description follows.


My sources are PuSH publishers; I have a PuSH hub running on the server; the lifestream is a PuSH subscriber.


For this project, I have three requirements:

  1. Realtime: I want the updates to appear in realtime. No polling. To achieve this, we rely on the PubSubHubbub (PuSH) protocol.
  2. Self-hosted: Everything needs to run on a machine/VM I own/rent (most of this stuff is on a Linode)
  3. Open Source / Free Software: Everything I run needs to be F(L)OSS

If we refer to the image above, the information flows from left to right, with the exception of the dashed line which we'll talk about a bit later.

Pushing Updates (PuSH publishers)

The Blog

The Blog is the most "manual" source at the moment. It runs on Jekyll and to publish a post, I push to my git repository on my Gogs instance (more information about this setup here).

After the new post has been published, I run a bash one-liner script that pings the PuSH hub:


curl -d hub.mode=publish -d "hub.url="

Yes, I do plan on automating this at some point.
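One way to automate it would be a git post-receive hook on the server that pings the hub after each push. A sketch with placeholder URLs (the real hub and topic URLs would go in their place); the final curl is echoed as a dry run rather than executed:

```shell
#!/bin/sh
# Hypothetical post-receive hook: after a push, notify the PuSH hub so
# subscribers re-fetch the feed. Both URLs below are placeholders.
HUB_URL="https://hub.example.com/"
TOPIC_URL="https://example.com/feed.xml"
# Dry run: echo the ping instead of sending it. Drop "echo" to go live.
echo curl -d hub.mode=publish -d "hub.url=${TOPIC_URL}" "${HUB_URL}"
```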


GNU MediaGoblin

I run GNU MediaGoblin as a media publishing platform. It's PuSH-enabled out-of-the-box so I don't have to do much here. I did, however, modify the Atom feed generated by gmg so that it includes a direct link to the media file (so that I can include it in the lifestream).


Gogs

I run a Gogs instance as well. This one is the most hacky-cobbled-together source in the lifestream (so far!). Ultimately, I'd like to have the public activity events pushed to the stream, but right now only "git-push" events are sent over. There are a couple of reasons for this:

  1. Gogs isn't PuSH-enabled, so I need to ping the hub myself
  2. Gogs is written in Go, a compiled language, which means I can't just open files and tweak them. And I've been too lazy to set up a Go/Gogs development environment so far.

Given this situation, I have to use the features Gogs supports out-of-the-box. Luckily, Gogs supports webhooks on git-push events. You can probably see where this is going: I've set up all my repositories to execute a PHP script whenever I push code:


$gogs = $config['gogs'];
$data = file_get_contents('php://input');

// Invalid payload (json_decode() returns null on failure; it doesn't throw)
$json = json_decode($data);
if ($json === null) {
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request');
    exit;
}

// Invalid 'secret'
if ($gogs['secret'] !== $json->secret) {
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request');
    exit;
}

$ch = curl_init($gogs['push']);
curl_setopt($ch, CURLOPT_USERAGENT, 'gogs webhook script');
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'hub.mode=publish&hub.url=' . $gogs['topic']);
$ok = curl_exec($ch);

if (!$ok) {
    error_log('Could not ping hub (errno: ' . curl_errno($ch) . ')', 4);
    error_log('Curl error message: ' . curl_error($ch), 4);
} else {
    error_log('Pinged hub successfully');
}
curl_close($ch);
This is also on the "to improve" list.


GNU social

This one is also easy. I use GNU social as a microblogging platform. It's PuSH-enabled out-of-the-box. I don't need to do anything here.

Note: There are a couple of event types that aren't PuSH'ed (e.g.: favorites) by the GNU social platform.


GNU FM

I added a small curl snippet to /nixtape/1.x/submissions/1.2/index.php and /gnukebox/submissions/1.2/index.php to ping the hub anytime I scrobble anything.

The Hubs

I run an unmodified instance of Switchboard as a PuSH hub. This is where the PuSH publishers send the "ping" whenever they publish something, with the exception of GNU social.

GNU social, in addition to being a PuSH-publisher out-of-the-box, has its own built-in PuSH hub.

The Subscriber

I wrote a quick NodeJS "app" that subscribes to the publishers above through the Switchboard hub or the GNU social hub. When it receives a notification from the hub that content has been published, it fetches the feed/page it's subscribed to (this is the dashed line in the image above), gets the latest update, parses it and saves the data to a database.

It also notifies the lifestream of new events through websockets.

The Lifestream

Finally, we have the lifestream which is a simple PHP script that gets previously saved events from the MySQL database, and new events through websockets.


I'm planning on adding more sources to the stream as time goes by. Polishing how everything works is also an ongoing activity.