Gerrit Niezen

Maker of open-source software and hardware.


I've been away at a Tidepool offsite, which explains the lack of updates to the blog. Hopefully we'll be back to regular daily updates from now on.

I've been struggling to successfully link to the avutil shared library on Windows. I thought it was because of a problem with the settings in my binding.gyp file, but it turns out I somehow didn't build the required routine into the DLL file. I just discovered that there's a command on Windows that you can use to inspect DLL files:

dumpbin /exports avutil.dll

This lists all the exported symbols. I expected av_lzo1x_decode to be in there and, surprisingly, it wasn't – which explains my linking troubles. Note that to use this command you need to have Visual Studio installed, and then either open a Developer Command Prompt or add dumpbin to your PATH.
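
If you're hunting for one specific symbol, you can also pipe the output through findstr (the Windows counterpart to grep) instead of scrolling through the whole list:

dumpbin /exports avutil.dll | findstr lzo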


As promised, here is the C code for wrapping libavutil's LZO algorithm into a Node.js module:

#include <node_api.h>
#include <napi-macros.h>
#include <stdio.h>
#include <stdlib.h>
#include "lzo.h"

#define BUFFER_SIZE 200

NAPI_METHOD(decompress) {
  NAPI_ARGV(2)
  NAPI_ARGV_BUFFER(in, 0)
  NAPI_ARGV_INT32(length, 1)

  // av_lzo1x_decode treats outlen as in/out: it goes in as the space
  // available and comes back as the space left over after decoding
  int outlen = BUFFER_SIZE;
  unsigned char *out = malloc(outlen);
  napi_value result;

  int ret = av_lzo1x_decode(out, &outlen, in, &length);
  int size = BUFFER_SIZE - outlen; // number of bytes actually decoded

  if (ret != 0) {
    free(out);
    napi_throw_error(env, NULL, "Failed to decompress");
    return NULL;
  }

  NAPI_STATUS_THROWS(napi_create_buffer_copy(env, size, out, NULL, &result));

  free(out);

  return result;
}

NAPI_INIT() {
  NAPI_EXPORT_FUNCTION(decompress)
}

It makes use of the napi-macros package to keep things simple and readable. Basically we're just reading in two arguments – the encoded data and its length – and handing them to av_lzo1x_decode. Note that outlen is an in/out parameter: it starts as the size of the output buffer and comes back as the number of bytes left in it, which is why the decoded size is BUFFER_SIZE - outlen. We then copy the result into a Node.js Buffer object and return it. Simple.

Let me know if you see any mistakes – this code hasn't been peer reviewed yet.
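
If you want to take it for a spin, a minimal usage sketch might look like this (my own untested example: it assumes the module was built so that node-gyp-build can find it, and that compressed.bin is a hypothetical file containing raw LZO1X data without any container header):

const binding = require('node-gyp-build')(__dirname);
const fs = require('fs');

// compressed.bin is a hypothetical file of raw LZO1X data
const compressed = fs.readFileSync('compressed.bin');
const decoded = binding.decompress(compressed, compressed.length);
console.log(decoded.toString());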

#Node.js


Yesterday I started wrapping the LZO algorithm inside libavutil into a Node.js library. I'm still working on the C code, but let's have a look at how to test if it's actually working. I needed to generate some LZO-compressed data, so I installed the lzop tool using sudo apt install lzop. Then I generated a compressed file using:

echo "The quick quick quick brown fox" | lzop > y.lzo

I opened the resulting file in a hex editor and found the compressed bit, which looked like this: The quick q)�brown fox. Based on this, I figured that if I skip the first 50 bytes of the file, I get to the actual compressed data. So I wrote the following JS code to test if the decompression algorithm works:

const binding = require('node-gyp-build')(__dirname);
const fs = require('fs');

fs.readFile('y.lzo', (err, data) => {
  if (err) throw err;
  const skip = 50; // skip the lzop file header – offset found by inspecting the file in a hex editor
  const ret = binding.decompress(data.slice(skip), data.length - skip);
  console.log(ret.toString());
});

module.exports = binding;
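
As an aside, instead of a hex editor you can hunt for that offset from Node itself. Here's a quick throwaway hex-dump helper (my own sketch, not part of the test code):

const fs = require('fs');

const data = fs.readFileSync('y.lzo');
for (let i = 0; i < data.length; i += 16) {
  const chunk = data.slice(i, i + 16);
  const hex = chunk.toString('hex').match(/../g).join(' ');
  // replace non-printable bytes with dots, like a hex editor does
  const ascii = chunk.toString('latin1').replace(/[^\x20-\x7e]/g, '.');
  console.log(String(i).padStart(4, '0'), hex.padEnd(47), ascii);
}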

So far I can see that the data decompresses successfully, but I'm still having a bit of trouble passing the decompressed data back as a Node.js Buffer object.

#Node.js


Now that I have built FFmpeg's libavutil as a shared library, I need to wrap the C code into a Node.js module. I've done this previously for libmtp, and will be following a similar approach.

First, we need to define our binding.gyp file:

{
    "targets": [{
        "target_name": "module",
        "sources": [ "./src/module.c" ],
        "library_dirs": [
          "../lib",
        ],
        "libraries": [
            "-lavutil"
        ],
        "include_dirs": [
          "<!(node -e \"require('napi-macros')\")"
        ]
    }],
}

We have one C source file called module.c stored in ./src, while our shared libraries are stored in ./lib. The relative path defining the library directories needs to go up one directory to work – presumably because node-gyp generates its build files in a build/ subdirectory, so relative paths get resolved from there. We also need to specify the name of the library we're linking against, avutil, otherwise we'll get a symbol lookup error.

Note that I'm using the napi-macros set of utility macros, which make using N-API a bit more fun, as well as prebuildify. All you need in your index.js is the following:

const binding = require('node-gyp-build')(__dirname);
    
module.exports = binding;
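
For completeness, the package.json scripts that tie this together look something like the following – a sketch from memory, so double-check against the prebuildify and node-gyp-build READMEs:

{
  "scripts": {
    "install": "node-gyp-build",
    "prebuild": "prebuildify --napi"
  }
}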

I'm still working on the C code, so look out for that tomorrow.

#Node.js


After going through a series of non-stick Teflon pans over the past decade, I wanted to try a cast-iron pan. They supposedly last multiple lifetimes if taken care of, so I wanted to see if I would enjoy using one.

The current state-of-the-art cast-iron pan is the one from Lodge, which is only $17 at Walmart but costs £50 here in the UK. So I decided to go with the Utopia, which is £20 on Amazon and had some decent reviews. Unfortunately the Wirecutter's criticism of the Utopia is true: while it looks very similar to the Lodge, the seasoning is very thin and doesn't actually prevent things from sticking to the pan.

Utopia recommends starting off with some bacon to improve the seasoning, but even though I've cooked a lot of bacon in the pan this week, the eggs I made this morning still stuck. I've since re-seasoned the pan twice, which involves coating it in vegetable oil and baking it in the oven at 180 degrees Celsius for 45 minutes.

Maybe I'm just not patient enough? Maybe I should go through another round of seasoning before I try frying eggs in it again?

#Food


Today I added a new section called Now to this website, inspired by Derek Sivers's Now page. Here's his description of what a now page is:

So a website with a link that says “now” goes to a page that tells you what this person is focused on at this point in their life. For short, we call it a “now page”.


After I abandoned my attempt to transpile a C# library to C yesterday, I started looking into compiling the LGPL version of FFmpeg's libavutil as a shared library, which I can then write a JavaScript wrapper around. I'll put the instructions here in case I, or someone else, need them again in future.

First, get the FFmpeg source files:

git clone git@github.com:FFmpeg/FFmpeg.git

Make sure you have all the required dependencies. A weird one I didn't know I needed was YASM (sudo apt install yasm).

To have it compile a shared library, you have to pass the right parameter to configure:

./configure --enable-shared 

You can then just run make. It'll take a while to compile, but if it finishes successfully you'll find the shared library (libavutil.so, alongside the usual static libavutil.a) in the libavutil folder.

It's also possible to cross-compile to a Windows DLL file using Linux. You just need to do:

sudo apt install gcc-mingw-w64
./configure --enable-shared --arch=x86 --target-os=mingw32 --cross-prefix=i686-w64-mingw32-
make clean
make

The make clean is necessary to clear out the object files from the previous native build, which can't be linked into the cross-compiled one.
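
To sanity-check the exports without booting into Windows, the mingw toolchain's objdump can inspect the resulting DLL – something like this (the exact file name depends on the libavutil version, hence the wildcard):

i686-w64-mingw32-objdump -p libavutil/avutil-*.dll | grep lzo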


After my previous post I managed to build the compiler myself. It turns out that when Emscripten installs itself, it installs its own copy of clang and puts it in the path. CoreRT falls over when it tries to use clang v6 instead of v3.9, so the solution was to uninstall Emscripten.

Even then, after installing the compiler and getting all the tests to pass, I was still running into issues. When I tried to link to the static library I built, the linker threw undefined references to things like CoreRT's RhpNewArray.

That, and the fact that the most recent version of the static library has ballooned to more than 30MB with only a couple of test functions in it, is leading me to abandon this approach. Maybe I'll come back to it when CoreRT is a bit more mature.


I came across some pretty cool open data maps recently. The first one is from M&S (via Spencer Wright's awesome The Prepared newsletter), which shows exactly where in the world their products (food & clothing) are made, as well as where they source their raw materials from:

https://interactivemap.marksandspencer.com

Then there's the Open Infrastructure Map that shows where electricity cables, telecoms towers and undersea cables etc. are located:

https://openinframap.org/#2/0/0

I've noticed that there are air pollution sensors installed by our local city council next to busy roads, and checked WhatDoTheyKnow to find out if anybody had made a freedom-of-information request for the data. It turns out they're already publishing it online:

https://airquality.gov.wales/


So, it turns out we now need to start sending our apps to Apple for “notarization” if we want them to continue to run on macOS:

Note that in an upcoming release of macOS, Gatekeeper will require Developer ID signed software to be notarized by Apple.

This seems like really bad news for macOS, but given the questionable decisions Apple has made in the name of “security” – Secure Kernel Extension Loading, for example – it's not unexpected. It basically implies that Apple plans to act as the gatekeeper for any software running on macOS, making it as restrictive as iOS.
