Gerrit Niezen


This is a follow-on to Part 1 and Part 2 of my adventures in connecting to Android devices over MTP using Node.js, on Windows.

While compiling the Node.js native module, I got the following error:

Macro definition of snprintf conflicts with Standard Library function declaration

Searching Stack Overflow led to this solution, which I added to the libmtp source code:

/* MSVC only gained a conforming snprintf in VS2015 (_MSC_VER 1900) */
#if _MSC_VER < 1900
#define snprintf _snprintf
#endif

I also kept on getting a “Module not found” error, until I re-read my own post and used Dependency Walker to figure out that for some reason my .node file is looking for libmtp-9.dll.dll instead of libmtp-9.dll. 🤷‍♂️️ I also had to copy libusb-1.0.dll into the same folder, as it was looking for that too.

Finally, like magic, I was able to connect to a device on Windows over MTP with my own Node.js library using libmtp, instead of the Windows MTP implementation that can only be accessed through Windows Explorer or the Windows APIs.

And I just submitted a PR to get Windows builds fixed in the upstream libmtp library.

Next step: Getting it compiled for 32-bit Windows using i686-w64-mingw32 and/or i686-mingw32


I’m publishing this as part of 100 Days To Offload. You can join in yourself by visiting https://100daystooffload.com.

#100DaysToOffload #day34 #libmtp #Node.js

After my promising start last week in reading data from a blood glucose meter using the IEEE 11073 standard, I've come across my first hurdle. I can read the Medical Device System (MDS) attributes just fine, as well as details of the PM (Persistent Metric) store. Basically, when data is stored on the device to be retrieved at a later date, it is placed in the PM store. I'm now trying to read data from the PM store, but it's not going as planned.

It took me a while to realize that I should be reading from the PM store's segment ID 0 instead of 1. You know, the classic off-by-one issue. Now it's telling me that the meter data is being sent, but I'm not receiving it as expected. My current guess is that the issue may be with WebUSB itself after looking at this Stack Overflow thread. Basically I'm thinking that I should use an event listener to read data from a bulk endpoint, rather than polling it as I'm doing at the moment. The thing is, I'm not sure if WebUSB supports using event listeners for reading data, so now I'm reimplementing everything in regular node-usb.

#Node.js

Yesterday I mentioned having Windows permission issues with a blood glucose meter that mounts as a mass storage device. After reading that someone else experienced the exact same issue while building a disk image writer for Windows using Node.js/Electron, I don't feel so alone anymore.

I think I finally figured out what's going on. To make a long story short: Windows allows you to open a physical drive (e.g. \\.\PhysicalDrive1) with an exclusive lock even if there's a mounted volume, but then doesn't permit you to write to it. If you try to open the logical name or mounted volume (e.g. \\.\E:) with an exclusive-lock flag (O_EXLOCK), it fails with the error “Resource busy or locked”, but if you lock the volume with FSCTL_LOCK_VOLUME after you've opened it, it works!
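The key insight is purely the ordering: open the handle first, lock second. Here's a minimal sketch of that sequence, where winApi is a hypothetical stand-in for whatever native binding exposes the open and DeviceIoControl calls (FSCTL_LOCK_VOLUME is the real Windows control code; everything else is illustrative):

```javascript
// Open-then-lock ordering that works for mounted volumes on Windows.
// `winApi` is a hypothetical wrapper around CreateFile/DeviceIoControl;
// only the call order is the point of this sketch.
function openLockedVolume(winApi, driveLetter) {
  const path = '\\\\.\\' + driveLetter + ':';
  // 1. Open WITHOUT an exclusive-lock flag. Asking for exclusivity at open
  //    time fails with "Resource busy or locked" while the volume is mounted.
  const handle = winApi.open(path);
  // 2. Lock the volume AFTER opening it; this is the step that succeeds.
  winApi.ioctl(handle, 'FSCTL_LOCK_VOLUME');
  return handle;
}

// Demo with a recording mock in place of the real Win32 calls:
const calls = [];
const mockApi = {
  open: (p) => { calls.push(['open', p]); return 42; },
  ioctl: (h, code) => { calls.push(['ioctl', h, code]); },
};
openLockedVolume(mockApi, 'E');
console.log(calls);
```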

It looks like one reason for this confusion is because of how Windows handles removable media differently, based on this comment from Jonas Hermsmeier:

Just remembered that the \\.\PhysicalDriveN only works for (things that act like) hard drives, not removable media. If something acts like a removable disk (or floppy, CD-ROM, etc.), \\.\X: opens the raw drive. This is due to the SD Card reader (for example) being the actual physical device, I believe. So it depends heavily on the device type.

#Node.js

I'm having some trouble reading data from a Verio Flex glucose meter on Windows. It seems to be some kind of permission issue, because it works just fine on macOS. On Linux we access the device directly as a USB device and communicate with it using SCSI commands. That won't work on Windows, however, as it would mean installing a WinUSB driver for the device, which would break software that depends on the meter being available as a mass storage device.

On Windows, the Verio & Verio Flex meters mount as mass storage devices, like a USB drive. However, you don't actually see the data as a file if you look at it in File Explorer. You have to set special permissions when opening the device and write data to it as commands in order to receive data. When I try sending commands to the device, I get an EPERM: operation not permitted, write error. It doesn't matter how I set the permissions; I just can't get it to succeed.

Strangely enough, using the same permissions, I also cannot write data to a regular USB drive. It's possible that there's just something wrong with the permissions on my computer, but I've tested it on two other machines too. It's also possible that Node.js is just not capable of setting the permissions correctly on Windows, but I would like to figure out why that is.

#Node.js

After finally getting the LZO decompression module working on 32-bit Windows, there was one more issue that popped up resulting in a malloc: Incorrect checksum for freed object error.

At first I thought there was something wrong with my macOS build of the avutil shared library, as the error occurred on macOS but not on Linux. After some debugging I discovered that it didn't occur every single time the code was run, suggesting some kind of buffer overflow instead.

When I increased the size of the output buffer by one, i.e. malloc(outputBufferSize + 1), the error stopped happening. Then my colleague Lennart discovered that the LZO code in libavutil requires the buffers to be padded by a certain number of bytes. I added the AV_LZO_INPUT_PADDING and AV_LZO_OUTPUT_PADDING constants from the header file to the buffer sizes, and so far everything finally seems to work without any problems.
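To make the padding requirement concrete, here's a small sketch of the arithmetic from the JavaScript side: the decoder may read a few bytes past the end of the input and write a few bytes past the end of the output, so both allocations need slack. The constant values below (8 and 3) are what libavutil's lzo.h defined at the time of writing; treat them as assumptions and check your own header:

```javascript
// Padding required by libavutil's LZO routines: the decoder may read up to
// AV_LZO_INPUT_PADDING bytes past the end of the input and write up to
// AV_LZO_OUTPUT_PADDING bytes past the end of the output buffer.
// Values taken from libavutil/lzo.h at the time of writing - verify locally.
const AV_LZO_INPUT_PADDING = 8;
const AV_LZO_OUTPUT_PADDING = 3;

// Pad a compressed input Buffer before handing it to the native decoder.
function padLzoInput(compressed) {
  return Buffer.concat([compressed, Buffer.alloc(AV_LZO_INPUT_PADDING)]);
}

// Number of bytes to actually allocate for an output of logical size n.
function lzoOutputAllocSize(n) {
  return n + AV_LZO_OUTPUT_PADDING;
}
```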

#Node.js


I struggled for two days straight to get my LZO Node module working on Windows. It kept complaining about “A dynamic link library (DLL) initialization routine failed.” when I tried to use it in Electron on Windows. It turns out that this is due to a bug in Electron and Node. The funny thing is that there is a workaround implemented in the prebuildify dependency I'm using; I just wasn't on the latest version. Always check your dependency versions, kids!

There was one more twist: It still wouldn't work on 32-bit Windows, resulting in a “Module not found” error when it's clearly in the right directory. It seems to have something to do with the avutil-56.dll I built, as using a prebuilt .dll works fine. I do need to build my own, as prebuilt ones are usually not LGPL licensed. So now I'm rebuilding FFmpeg for the umpteenth time, this time round using the ffmpeg-windows-build-helpers script to build LGPL-licensed 32-bit and 64-bit cross-compiled shared libraries:

./cross_compile_ffmpeg.sh --enable-gpl=n --build-ffmpeg-shared=y --compiler-flavors=multi

#Node.js


I found the reason why I didn't get av_lzo1x_decode compiled into the avutil library when cross-compiling to Windows from Linux the first time (as described in yesterday's blog post). Basically I needed to do sudo apt-get install mingw-w64-tools first, which adds a tool that ensures the necessary library functions get compiled in. Who knew?

Because the flag to use run-time references is enabled by default, you need to recreate the import library from the .dll file using the .def file:

lib /machine:x64 /def:avutil-56.def /out:avutil.lib

Note that if you're using an avutil-56.def file, your executable will be looking for an avutil-56.dll, not avutil.dll as expected. I had to use Dependency Walker to figure this out. And then I discovered that I had only built the 32-bit DLL. To build for 64-bit, use:

./configure --enable-shared --arch=x86_64 --target-os=mingw32 --cross-prefix=x86_64-w64-mingw32-

By the way, if you want to check if a DLL file is 32-bit or 64-bit, you can use

dumpbin /headers avutil-56.dll | findstr machine

To use the right DLL with the right architecture, I used the following snippet based on headless-gl's binding.gyp:

'copies': [
  {
    'destination': '$(SolutionDir)$(ConfigurationName)',
    'files': [
      '<(module_root_dir)/lib/<(target_arch)/avutil-56.dll'
    ]
  }
]

This will copy the DLL from either lib/x64 or lib/x86 and put it in the same folder as module.node. Nice!

#Node.js


As promised, here is the C code for wrapping libavutil's LZO algorithm into a Node.js module:

#include <node_api.h>
#include <napi-macros.h>
#include <stdio.h>
#include <stdlib.h>
#include "lzo.h"

#define BUFFER_SIZE 200

NAPI_METHOD(decompress) {
  NAPI_ARGV(2)
  NAPI_ARGV_BUFFER(in, 0)
  NAPI_ARGV_INT32(length, 1)

  int outlen = BUFFER_SIZE;
  unsigned char *out = malloc(outlen);
  napi_value result;

  /* av_lzo1x_decode decrements outlen by the number of bytes it writes */
  int ret = av_lzo1x_decode(out, &outlen, in, &length);

  if (ret != 0) {
    free(out);
    napi_throw_error(env, NULL, "Failed to decompress");
    return NULL;
  }

  int size = BUFFER_SIZE - outlen;

  NAPI_STATUS_THROWS(napi_create_buffer_copy(env, size, out, NULL, &result));

  free(out);

  return result;
}

NAPI_INIT() {
  NAPI_EXPORT_FUNCTION(decompress)
}

It makes use of the napi-macros package to keep things simple and readable. Basically we're just reading in two arguments – the encoded data and its length. We then send that to the decode algorithm by calling av_lzo1x_decode, and copy the result into a Node.js Buffer object that we return. Simple.

Let me know if you see any mistakes – this code hasn't been peer reviewed yet.

#Node.js


Yesterday I started wrapping the LZO algorithm inside libavutil into a Node.js library. I'm still working on the C code, but let's have a look at how to test if it's actually working. I needed to generate some LZO-compressed data, so I installed the lzop tool using sudo apt install lzop. Then I generated a compressed file using:

echo "The quick quick quick brown fox" | lzop > y.lzo

I opened the resulting file in a hex editor and found the compressed bit, which looked like this: The quick q)�brown fox. Based on this, I figured that if I skip the first 50 bytes of the file, I get to the actual compressed data. So I wrote the following JS code to test if the decompression algorithm works:

const binding = require('node-gyp-build')(__dirname);
const fs = require('fs');

fs.readFile('y.lzo', (err, data) => {
  if (err) throw err;
  const skip = 50;
  const ret = binding.decompress(data.slice(skip), data.length - skip);
  console.log(ret.toString());
});

module.exports = binding;

So far I can see that the data decompresses successfully, but I'm still having a bit of trouble passing back the data into a Node.js Buffer object.
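That hard-coded 50-byte skip was found by eyeballing the file in a hex editor, and it will change with the header contents (file name length, options, checksums). A slightly safer sketch at least validates lzop's fixed 9-byte magic before slicing; the magic bytes are from the lzop file format, but the skip offset remains an eyeballed assumption rather than a parsed value:

```javascript
// lzop files start with a fixed 9-byte magic sequence ("\x89LZO\0\r\n\x1a\n").
const LZOP_MAGIC = Buffer.from([0x89, 0x4c, 0x5a, 0x4f, 0x00, 0x0d, 0x0a, 0x1a, 0x0a]);

function sliceCompressedData(data, skip) {
  if (!data.slice(0, LZOP_MAGIC.length).equals(LZOP_MAGIC)) {
    throw new Error('Not an lzop file');
  }
  // `skip` was determined by eye in a hex editor; a real parser would walk
  // the variable-length header (version, method, name, checksums) instead.
  return data.slice(skip);
}
```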

#Node.js


Now that I have built FFmpeg's libavutil as a shared library, I need to wrap the C code into a Node.js module. I've done this previously for libmtp, and will be following a similar approach.

First, we need to define our binding.gyp file:

{
  "targets": [{
    "target_name": "module",
    "sources": [ "./src/module.c" ],
    "library_dirs": [
      "../lib"
    ],
    "libraries": [
      "-lavutil"
    ],
    "include_dirs": [
      "<!(node -e \"require('napi-macros')\")"
    ]
  }]
}

We have one C source file called module.c stored in ./src, while our shared libraries are stored in ./lib. For some reason, the relative path defining the library directories needs to go up one directory in order to work. We also need to define the name of the library we're using, avutil, otherwise we'll get a symbol lookup error.

Note that I'm using the napi-macros set of utility macros that makes using N-API a bit more fun, as well as prebuildify. All that you need in your index.js is the following:

const binding = require('node-gyp-build')(__dirname);
    
module.exports = binding;

I'm still working on the C code, so look out for that tomorrow.

#Node.js