PROJET AUTOBLOG


Chromic

Original site: Chromic


Home Network

Thursday, December 8, 2016 at 06:00

Hello, folks! It's time again for that yearly blog post! It's, yet again, mainly a "note-to-self" type of post.

This time I'm documenting how my home network is currently set up. In short, I have two separate internal networks: an "IoT/Untrusted" network and a "Trusted" network. In this context, "trusted" means that I installed whatever OS/ROM I wanted on the device. It's a pretty common idea; these are my experiments on the subject.

Diagram of my Home Network (click for larger view)

The Tubes

Let's start with the Internet. I connect to it like most people do: via my ISP-provided router/modem. It happens to be a Hitron CGN3. Behind that, I have a pretty versatile EdgeRouter X. I set up two different subnets on there: 192.168.1.0/24, which is the "IoT/Untrusted" network, and 192.168.2.0/24, which is the "Trusted" network. The two networks are on separate VLANs and the ER-X's firewall rules don't allow the networks to talk to each other.
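For reference, that separation looks roughly like this in the EdgeOS/Vyatta-style CLI. This is a simplified sketch rather than my literal config: the VLAN IDs, interface names and rule numbers are assumptions.

```shell
# Two VLANs on the internal switch interface (VLAN IDs are assumptions)
set interfaces switch switch0 vif 10 address 192.168.1.1/24   # "IoT/Untrusted"
set interfaces switch switch0 vif 20 address 192.168.2.1/24   # "Trusted"

# Drop anything from the IoT VLAN destined for the Trusted subnet
set firewall name IOT_IN default-action accept
set firewall name IOT_IN rule 10 action drop
set firewall name IOT_IN rule 10 destination address 192.168.2.0/24

# Apply the ruleset to traffic entering the router from the IoT VLAN
set interfaces switch switch0 vif 10 firewall in name IOT_IN
```

A mirrored ruleset on the Trusted VLAN makes the block symmetric, if that's what you want.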

The Airwaves

The ER-X doesn't do wireless, so I have a wireless AP connected to each of the subnets. The "IoT/Untrusted" wireless AP (10.1.1.0/24) is an old D-Link DIR-615 Rev. B2 that I had lying around. The "Trusted" wireless AP (10.1.2.0/24) is a UniFi In-Wall AP.

The Danger Zone

The "IoT/Untrusted" network has a few devices on it: a PlayStation 3, a Nintendo Wii, an iPhone 5S, an iPad Air, a Samsung TV, and a Brother printer.

I'm planning on applying different rules for different devices on this network.

Some examples could be:
  • Don't let the printer access the Internet, only LAN
  • Limit TV internet access to Netflix only
  • Don't let the devices talk to each other (except the printer)
  • etc.
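The printer rule from the list above could look something like this on the ER-X (again, Vyatta-style CLI; the ruleset name and the printer's address are made up for illustration):

```shell
# Hypothetical: give the printer a static lease at 192.168.1.20, then
# drop any of its traffic that isn't destined for the local subnet
set firewall name IOT_IN rule 20 action drop
set firewall name IOT_IN rule 20 source address 192.168.1.20
set firewall name IOT_IN rule 20 destination address !192.168.1.0/24
```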

The Safespace

I have a few devices on the "Trusted" network.

The VPS

For kicks, and because it's a feature of the ER-X out-of-the-box, I set up an IPSec site-to-site tunnel between the ER-X and my Linode VPS.
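On EdgeOS, that tunnel is a handful of `set vpn ipsec site-to-site` statements. A rough sketch, with the peer address, secret, group names and prefixes all stand-ins rather than my real values:

```shell
# Peer is the VPS's public address (203.0.113.10 is a placeholder)
set vpn ipsec site-to-site peer 203.0.113.10 authentication mode pre-shared-secret
set vpn ipsec site-to-site peer 203.0.113.10 authentication pre-shared-secret 'changeme'
set vpn ipsec site-to-site peer 203.0.113.10 ike-group IKE0
set vpn ipsec site-to-site peer 203.0.113.10 tunnel 1 esp-group ESP0
set vpn ipsec site-to-site peer 203.0.113.10 tunnel 1 local prefix 192.168.2.0/24
set vpn ipsec site-to-site peer 203.0.113.10 tunnel 1 remote prefix 10.8.0.0/24
```

The IKE/ESP groups have to be defined separately, and something like strongSwan needs a matching configuration on the Linode end.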

I Should Blog More. Maybe. Probably Not, Though.

Friday, August 26, 2016 at 06:00

Brace yourselves, you’re in for an “existential crisis”-type of blog post. Welcome to LiveJournal. Or something, I don’t know…

Yeah so, I like to read blogs. Always have. I've been reading blogs since I learned the internet was a thing. I'm subscribed to a bunch of blogs via this archaic thing called RSS — also h-feeds since that's what the cool people are using these days :) (no, but seriously, check it out; it makes a whole lot of sense).

Sometimes, when reading those blogs, I get the feeling that I should blog more myself. Especially when those blog posts explicitly ask readers to blog more.

And then after thinking about it for about three seconds, I think maybe I should not. I'm not very good at it. Nor do I have anything very interesting to write about, really.

Blogging about my personal experiences surely can't be interesting to anyone, right? Sometimes I post notes about it. Rarely, some pictures. But a long-form blog post about me? Ain't nobody got time for that, right? Why would anyone read that? Why are you still here, friend?

And then there's blogging about technical things. I like computers. I studied computers. I write code for a job. I write code as a hobby. Surely I could write something about that, don't you think? Well, I could. Then I think about it for about three seconds, search the ol' 'net for about three seconds, and give up.

Thing is, when I think I have something to write about, a quick search yields a plethora of people smarter than me, who wrote better articles than I could write, often years before that idea came to my mind. So I repost, or bookmark, or otherwise share some of those articles, but I don't blog. Which wouldn't be a bad thing, except for that nagging feeling that I want to create, participate, actively be involved in something related to my field.

See, the internet has always had a soft spot in my heart, but for a few years now it also makes me feel very, very small, insignificant, underachieving and inadequate. I read about all these people creating, writing, coding and otherwise making successful, forward-thinking, cool, original things. And I'm just sitting here making friends with spiders. Or whatever.

Sometimes I remember that the articles I read, and all those new libraries, frameworks, studies, articles, and music tracks I discover, are written and built by multiple, separate people over a long period of time. And for about three seconds I feel better realizing people (usually) don't produce great things during a single lunch break between their TED talks and a flight to some other conference.

For those quick three seconds, I realize that it's not "me against the internet competing for a job position". The "internet" isn't an "average" entity to compare yourself to. The internet doesn't sleep. The internet is an expert in everything. The internet has ~7 billion brains. It's just not a fair comparison.

Yet, I go back to my restless self, writing a useless blog post at 04:08 because I can't sleep since I feel like I haven't shipped anything concrete, nothing useful, today. 2016-08-25 is gone, and I have nothing to show for it. And let's face it, this blog post is probably going to be the most productive thing I'm going to do today. Or this month. Or year. Oh well.

Lifestream Architecture

Monday, June 15, 2015 at 06:00

Recently, the homepage of this blog changed from a static list of recent blog posts to a realtime stream of updates (the list of blog posts is still there, moved to the sidebar). This post tries to document (mostly as a "note-to-self", again) how this "lifestream" works.

Note: This is a work-in-progress; I've already got plans to tweak how some of the pieces fit together.

An overview of the pieces used to build the update stream. Description follows.

TL;DR

My sources are PuSH publishers, a PuSH hub runs on the server, and the lifestream is a PuSH subscriber.

Details

For this project, I have three requirements:

  1. Realtime: I want the updates to appear in realtime. No polling. To achieve this, we rely on the PubSubHubbub (PuSH) protocol.
  2. Self-hosted: Everything needs to run on a machine/VM I own/rent (most of this stuff is on a Linode)
  3. Open Source / Free Software: Everything I run needs to be F(L)OSS

If we refer to the image above, the information flows from left to right, with the exception of the dashed line which we'll talk about a bit later.

Pushing Updates (PuSH publishers)

The Blog

The Blog is the most "manual" source at the moment. It runs on Jekyll and to publish a post, I push to my git repository on my Gogs instance (more information about this setup here).

After the new post has been published, I run a bash one-liner script that pings the PuSH hub:

#!/usr/bin/bash

curl -d hub.mode=publish -d "hub.url=http://chromic.org/feed.xml" http://push.chromic.org

Yes, I do plan on automating this at some point.

Media

I run GNU mediagoblin as a media publishing platform. It's PuSH-enabled out-of-the-box so I don't have to do much here. I did, however, modify the Atom Feed generated by gmg so that it includes a direct link to the media file (so that I can include it in the lifestream).

Code

I run Gogs on code.chromic.org. This one is the most hacky-cobbled-together source in the lifestream (so far!). Ultimately, I'd like to have the public activity events pushed to the stream, but right now only "git-push" events are sent over. There are a couple of reasons for this:

  1. Gogs isn't PuSH-enabled so I need to ping the hub myself
  2. Gogs is written in Go, which is a compiled language, which means I can't just open files and tweak them. And, I've been too lazy to set up a Go/Gogs development environment so far.

Given this situation, I have to use the features Gogs supports out-of-the-box. Luckily, Gogs supports webhooks on git-push events. You can probably see where this is going: I've set up all my repositories to execute a PHP script whenever I push code:

<?php
require_once('../../_config.php');

$gogs = $config['gogs'];
$data = file_get_contents('php://input');

// Invalid payload: json_decode() doesn't throw on bad input,
// it returns null, so check for that instead of using try/catch
$json = json_decode($data);
if ($json === null) {
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request');
    exit;
}

// Invalid 'secret'
if ($gogs['secret'] !== $json->secret) {
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request');
    exit;
}

$ch = curl_init($gogs['push']);
curl_setopt($ch, CURLOPT_USERAGENT, 'gogs webhook script');
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'hub.mode=publish&hub.url=' . $gogs['topic']);
$ok = curl_exec($ch);

if ($ok === false) {
    // Grab the error details before closing the handle
    error_log('Could not ping hub (errno: ' . curl_errno($ch) . ')');
    error_log('Curl error message: ' . curl_error($ch));
} else {
    error_log('Pinged hub successfully');
}

curl_close($ch);

This is also on the "to improve" list.

Microblog

This one is also easy. I use GNU social as a microblogging platform. It's PuSH-enabled out-of-the-box. I don't need to do anything here.

Note: There are a couple of event types that aren't PuSH'ed (e.g.: favorites) by the GNU social platform.

Fm

I added a small curl snippet to /nixtape/1.x/submissions/1.2/index.php and /gnukebox/submissions/1.2/index.php to ping the hub anytime I scrobble anything.

The Hubs

I run an unmodified instance of Switchboard as a PuSH hub. This is where the PuSH publishers send the "ping" whenever they publish something, with the exception of GNU social.

GNU social, in addition to being a PuSH-publisher out-of-the-box, has its own built-in PuSH hub.

The Subscriber

I wrote a quick Node.js "app" that subscribes to the publishers above through the Switchboard hub or the GNU social hub. When it receives a notification from the hub that content has been published, it fetches the feed/page it's subscribed to (this is the dashed line in the image above), gets the latest update, parses it and saves the data to a database.
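The subscription half of PuSH is worth sketching: the subscriber first discovers the hub from the feed's `rel="hub"` link, then POSTs a subscription request to it. A minimal shell illustration — the feed snippet and the callback URL are made up for the example:

```shell
# A stripped-down Atom feed advertising its hub (illustrative only)
feed='<feed xmlns="http://www.w3.org/2005/Atom">
<link rel="hub" href="http://push.chromic.org"/>
<link rel="self" href="http://chromic.org/feed.xml"/>
</feed>'

# Pull the hub URL out of the rel="hub" link
hub=$(printf '%s' "$feed" | grep -o 'rel="hub" href="[^"]*"' | sed 's/.*href="//; s/"$//')
echo "$hub"   # http://push.chromic.org

# The subscriber would then POST something like this to the hub
# (the callback URL is hypothetical; commented out so nothing is sent):
# curl -d hub.mode=subscribe \
#      -d hub.topic=http://chromic.org/feed.xml \
#      -d hub.callback=http://example.org/push-callback \
#      "$hub"
```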

It also notifies the lifestream of new events through websockets.

The Lifestream

Finally, we have the lifestream which is a simple PHP script that gets previously saved events from the MySQL database, and new events through websockets.

Conclusion

I'm planning on adding more sources to the stream as time goes by. Polishing how everything works is also an ongoing activity.

Lifestream Architecture was originally published by Chimo at Chromic on June 15, 2015.

Git, Gogs, Jekyll and Auto-deployment

Wednesday, May 13, 2015 at 06:00

Since yesterday, this blog deploys automatically when I push changes to its git repository. Before that, my workflow was something like:

  1. ssh into the server
  2. write something
  3. git add, git commit, git push (or forget about this step altogether)
  4. bundle exec jekyll build

Now, I do:

  1. Write something (from any machine where I have a copy of my blog's git repository)
  2. git add, git commit, git push

The best part for me is that I should now have a proper git history of my blog's modifications instead of a few giant commits spaced far apart in time due to skipping step 3 by mistake or laziness.

This isn't unique, it has been done before, and a few automated deployment techniques are documented on the Jekyll site, but I wanted to jot down my configuration anyway.

The Environment

This blog is hosted on a VPS (Linode) that runs a few other services, notably Gogs, which I use to "self-host" my git projects.

One of those git projects is the source files for the blog.

Gogs lets you edit git hooks via its web interface under the git repository's "Settings > Git Hooks" section. The one we're interested in is "post-receive", which runs on the server-side after git is done receiving the changes you pushed to it via "git push" from the client-side.

Since both Gogs and the Jekyll blog are on the same server, the only thing I have to do is tell git to run the "jekyll build" command (along with some house-keeping details) after it got the new data:

#!/usr/bin/bash

mkdir -p /tmp/gogs-bundle               # Create directory where bundle will install required tools to build
export BUNDLE_BIN_PATH=/tmp/gogs-bundle # Set env. var used by bundle so it knows where to install stuff

cd /srv/http/chromic.org/public_html    # Change to the directory where the blog's source files are
unset GIT_DIR                           # Have git use $PWD instead of $GIT_DIR
git pull                                # Get latest changes

bundle install                          # Make sure we have all the required gems, etc. installed
bundle exec jekyll build                # Deploy!

And that's pretty much it. Make sure the user running your Gogs instance has access to the blog's files/directories and you should be good to go.