
Showing posts from 2017

Grayscale compression with libx264

In a video analysis application we are using the Up Board, based on the Intel Z8350. The primary task is real-time analysis and there is no need for video recording, but during testing it would be useful to integrate such a feature. Unfortunately, this chipset is not Intel Quick Sync capable, meaning that it offers no hardware video encoding, a feature easily found even in cheaper boards such as the RPi.

The sensor is natively grayscale at Full HD resolution and I was looking for a good compressor capable of working directly in grayscale. I looked into libx264, the fastest compressor around, but it does not natively accept mono input. This means that the workaround is to provide a planar YUV frame with zeroed U and V planes. This is clearly a waste of memory transfer and processing.
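The workaround can be sketched with numpy (a minimal illustration, not the actual pipeline; the function name is mine). The luma plane is just the sensor frame, and two quarter-resolution chroma planes are filled with a constant (128 is the neutral chroma value) only to satisfy the encoder's 4:2:0 layout:

```python
import numpy as np

def gray_to_i420(gray: np.ndarray) -> np.ndarray:
    """Pack a grayscale frame into a planar I420 buffer with constant
    chroma planes, so it can be fed to an encoder that only accepts
    YUV 4:2:0 input. The chroma third of the buffer carries no
    information: this is the waste the post describes."""
    h, w = gray.shape
    y = gray.reshape(-1)                                  # full-res luma plane
    uv = np.full((h // 2) * (w // 2) * 2, 128, np.uint8)  # 2 quarter-res chroma planes
    return np.concatenate([y, uv])

frame = np.zeros((1080, 1920), np.uint8)  # one Full HD sensor frame
i420 = gray_to_i420(frame)                # 1.5x the size of the mono frame
```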

The H.264 specification supports this case as chroma format 4:0:0, but libx264 does not implement it. For this reason I have patched it to support this capability, also updating the testing tool provided with libx264. The outcom…

OSX Sandboxing for include file interference during build

A while ago I posted about using the OSX built-in sandbox to protect against unwanted file or network accesses.

Sometimes I perform Windows cross-compilation using Docker (e.g. dockcross) or, more typically, directly using cmake under OSX.

In both cases there is a chance that a library's cmake scripts misbehave when looking up dependencies and pick up the system include files (e.g. /usr/include and /usr/local/include). The OSX sandbox comes in handy here, allowing one to block access to one or more directories.

This is the OSX sandboxing script I am using:

(version 1)
(deny default)
(allow sysctl-read)
(allow signal)
(allow process-exec)
(allow process-fork)
(allow mach* sysctl-read)
(allow file-read* (regex "^.*"))
(deny file-read* (regex "^/usr/local/include.*") (regex "^/usr/include.*"))
(allow file-write* (regex (string-append "^/tmp/.*")) (regex (string-append "^" (regex-quote (param "target…
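Assuming the profile is saved as build.sb (the filename and target path below are hypothetical), the build can then be wrapped with sandbox-exec, here invoked from Python:

```python
import subprocess
import sys

# Hypothetical invocation: the profile above saved as "build.sb", with the
# allowed write directory at /tmp/build. The -D flag of sandbox-exec
# substitutes the (param "target...") placeholder used in the profile's
# file-write* rule.
cmd = [
    "sandbox-exec",
    "-f", "build.sb",
    "-D", "target=/tmp/build",
    "cmake", "--build", ".",
]
if sys.platform == "darwin":  # sandbox-exec exists only on macOS
    subprocess.run(cmd, check=True)
```

With the profile's deny rules active, any compiler probe into /usr/include or /usr/local/include fails immediately, which makes a misbehaving dependency lookup visible instead of silently succeeding.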

Space-time possibly fractal dimensions in arts and books

I was reading a collection of stories by John Barrow (100 things..., 2014) and among the maths and arts stories there was one (#58) about the dimensionality in space and time of artistic production. He argues that the basic arts can be mapped to spatial dimensionality in one, two, and three dimensions, from lines to sculpture. A time dimension can then be easily added to each of them.

In this short post I provide some other examples and variations of this idea, and give an example of the connection between this spatial dimensionality and its use in Machine Learning.


[script] Extract and Modify mp4 frame durations

In a recent data acquisition project we used an embedded board producing mp4 videos at a variable video acquisition rate. While the video is nominally at 30 FPS, the effective rate is around 29.5 FPS with some jitter.

The general suggestion when processing videos with the OpenCV VideoCapture is to save the frame time (CV_CAP_PROP_POS_MSEC) so that each frame is associated with its real timestamp, an important step when video annotations are expressed in time units.

The general ffmpeg way to extract the per-frame timestamps is the following:

ffprobe -i INPUT -show_frames -show_entries frame=pkt_pts_time -of csv=p=0

On a decent machine, a 30-minute Full HD mp4 video takes 12 minutes with ffprobe or OpenCV, and this is not acceptable.

Luckily mp4 files are easy to parse, and the "stts" (time-to-sample) atom provides this information in a compact form (run-length encoding). Having not found an existing solution online, I have prepared a Python script that extracts the timings as fast as possible (numpy.fromfile) starting from the m…
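The core of the idea can be sketched in a few lines (a minimal illustration, not the actual script; the function name and example values are mine). The "stts" box payload is a list of (sample_count, sample_delta) pairs, which expand into per-frame durations via the track timescale:

```python
import struct

import numpy as np

def expand_stts(payload: bytes, timescale: int) -> np.ndarray:
    """Expand the RLE (sample_count, sample_delta) pairs of an 'stts'
    box payload into per-frame durations in seconds.

    `payload` is the box body after the 8-byte size/type header:
    version+flags (4 bytes), entry_count (uint32 BE), then the entries.
    """
    (entry_count,) = struct.unpack_from(">I", payload, 4)
    entries = np.frombuffer(payload, dtype=">u4", count=2 * entry_count, offset=8)
    counts, deltas = entries[0::2], entries[1::2]
    return np.repeat(deltas, counts) / timescale

# Synthetic example: 3 frames with delta 3000, 1 frame with delta 3100,
# timescale 90000 (i.e. nominally ~30 FPS with jitter on the last frame)
payload = struct.pack(">IIIIII", 0, 2, 3, 3000, 1, 3100)
durations = expand_stts(payload, 90000)
```

Since the whole table is only a few entries long even for a long video, this runs in microseconds instead of the 12 minutes needed to decode every frame.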

JSON bag - bInary multipart JSON response

In a scenario in which I am involved we have an embedded server that reports the status of activities and devices, combining structured data (JSON) and medium-sized binary data (images). This post and the associated github repository (here) present an AJAX response that combines JSON data with one or more binary payloads. The objectives are the following:

- avoid base64 bandwidth and decoding overhead
- avoid custom encodings such as JSONB/MessagePack that are slow to decode in the browser
- reduce the number of requests, or more complex production logic (if not using HTTP 2.0 Server Push)
- usable for file serialization
- graceful degradation (Content negotiation and JavaScript support)

The approach is the following:

- data encoding, transferred as a JavaScript ArrayBuffer, that contains a readable header, the main JSON, and then binary blocks each with a JSON header
- processing on the client side that patches the JSON, replacing stub URIs with the new Blob URIs
- server-side generation that can produce both the JSON bag and a …
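The server-side packing can be sketched as follows (an illustrative layout only: the magic string, length fields, and header field names here are mine; the real format is defined in the linked repository):

```python
import json

def pack_bag(main: dict, blobs: dict) -> bytes:
    """Hypothetical sketch of the layout described above: a readable
    header, the main JSON, then one binary block per blob, each
    preceded by its own small JSON header with name and size."""
    out = bytearray()
    main_bytes = json.dumps(main).encode()
    out += b"JSONBAG1"                                # readable magic header
    out += len(main_bytes).to_bytes(4, "little")      # main JSON length
    out += main_bytes
    for name, data in blobs.items():
        head = json.dumps({"name": name, "size": len(data)}).encode()
        out += len(head).to_bytes(4, "little")        # per-block JSON header
        out += head
        out += data                                   # raw binary payload
    return bytes(out)

# The main JSON references the blob by a stub URI that the client
# will later patch into a Blob URI.
bag = pack_bag({"status": "ok", "image": "blob://snapshot"},
               {"snapshot": b"\x89PNG..."})
```

On the client the ArrayBuffer is walked the same way: read the main JSON length, parse it, then parse each block header to slice out the binary payloads without any base64 round-trip.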