Jeeliz for embedded systems

At first glance, it may seem strange to use Jeeliz technology for embedded systems. Indeed, its natural environment is the web browser. But web programming is seeping in everywhere:

  • in mobile applications with PWA (Progressive Web Apps),
  • in desktop applications with Electron,
  • server-side with Node.js,
  • and even in blockchain-based decentralized applications, with Lisk for example.

So why embedded systems? In this article, we describe the main advantages of our technology compared to a native implementation.

A JeelizBox embedded into a 70-inch advertisement display,
and everybody has fun!

Hardware agnostic

The web browser is increasingly becoming a kind of virtual machine: it adds a level of abstraction between the application and the operating system. We develop embedded applications as web applications, so we can easily upgrade the hardware or the operating system without having to recompile or port anything.

The life cycle of an embedded application is often long, so it is important to bet on a technology that is stable over time. Otherwise, maintenance and evolution may become difficult and expensive. Web technologies take a long time to specify because standards organizations like the W3C or the Khronos Group have to bring a lot of stakeholders into agreement. For example, the Khronos Group, which is in charge of the WebGL API, includes universities, browser vendors, graphics hardware makers and operating system builders. But as soon as a standard is specified and implemented in the main web browsers, it remains unchanged for years, and evolutions are almost always backward compatible.

We think that even in 20 years, web pages will still have the same HTML5-based structure. ECMAScript 5, the standardized version of JavaScript, will still work, and so will WebGL and WebRTC. So why not use them for an embedded application?

The overhead of the web browser is negligible: JavaScript engines are very fast now, and the speed-limiting factor is the GPU anyway. With WebGL, we can exploit from 80% to 100% of the GPU computing power. And we use Electron so that we do not embed the whole user interface of a full web browser.
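
As an illustration, here is a minimal sketch of the kind of Electron entry point that can wrap a web application for an embedded display, so that only the application itself is shown, without any browser UI. The file name and window options are illustrative, not the actual JeelizBox configuration:

    // main.js - minimal Electron entry point (illustrative)
    const { app, BrowserWindow } = require('electron');

    app.whenReady().then(() => {
      // Open the web application fullscreen, in kiosk mode,
      // so no browser chrome (tabs, address bar, menus) is displayed:
      const win = new BrowserWindow({
        fullscreen: true,
        kiosk: true,
        autoHideMenuBar: true
      });
      // The "web page" is simply loaded from the local filesystem:
      win.loadFile('index.html');
    });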

Development lifecycle

The development workflow

We have developed internally an efficient, integrated workflow that goes from the initialization of a neural network to its integration into an application. We can:

  • initialize the neural network using a live coding interface,
  • train it and control the training with a graphical user interface,
  • test the neural network,
  • integrate it to build a JavaScript library, using input acquisition helpers and output stabilization helpers,
  • compress and optimize this library.

With this workflow, we build the libraries released on our GitHub repository. Now, why not use them in an embedded context to build smart cameras, interactive advertisement displays or virtual mirrors?
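
For example, here is roughly how one of these released libraries, the face tracking library, is initialized from a web page. The option names below follow the public repository and may differ slightly between versions:

    // Initialize the Jeeliz face tracking library on a <canvas> element:
    JEELIZFACEFILTER.init({
      canvasId: 'jeeFaceFilterCanvas',  // id of the <canvas> used for video processing
      NNCPath: './neuralNets/',         // path of the trained neural network model
      callbackReady: function(errCode, spec){
        if (errCode){
          console.log('Cannot initialize the library, error =', errCode);
          return;
        }
        console.log('Library initialized :)');
      },
      callbackTrack: function(detectState){
        // detectState.detected is the face detection probability,
        // detectState.x, detectState.y, detectState.s give its position and scale:
        if (detectState.detected > 0.8){
          // the face is detected: update the application...
        }
      }
    });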

Write once, use everywhere

The same application may also be needed on other platforms. For instance, an optical shop may want:

  • an advertisement display beside its shop window where passers-by can try on its latest sunglasses models, powered by an embedded version of the Jeeliz virtual try-on application,
  • a virtual try-on module integrated into its website where customers can try on glasses models before buying them online,
  • a mobile application with the same virtual try-on feature.

So it is important to be able to reuse the same component on all these devices. The standard to choose is the one that runs in the most constrained environment. That environment is the web, because:

  • the code should run on any device, even the weakest,
  • it should be written against strict standards,
  • it should be secure.

So we choose the web standards and find a way to use them in less constrained environments, including the embedded one.

The problems of compatibility between languages go back to the beginnings of humanity.
Similarly, in computer science, they appeared from the very start.
Painting: “The Tower of Babel” by Pieter Bruegel the Elder. Source: Wikipedia.

As fast as native

Since our technology relies on deep learning neural networks, it is massively parallelizable and runs considerably faster on the GPU. Even a native deep learning implementation would run faster on the GPU. Indeed, a CPU is made to process intricate, sequential tasks, whereas a GPU is designed to run simple tasks in parallel. Most often, these tasks consist of computing pixel colors, but we can just as well sum synaptic weights or run other computations.
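
To give an intuition of how a graphics API can be diverted into a neural network runtime, here is a toy WebGL fragment shader (embedded as a JavaScript string) where each output pixel stores the weighted sum computed by one neuron, with inputs and synaptic weights read from floating-point textures. This is a simplified illustration, not the shader actually used in our framework:

    // Toy GLSL fragment shader: each rendered pixel holds one neuron output,
    // computed as the weighted sum of inputs read from two float textures.
    const NEURON_SHADER_SOURCE = `
      precision highp float;
      uniform sampler2D uInputs;   // input values, one per texel
      uniform sampler2D uWeights;  // synaptic weights, one row per neuron
      uniform float uInputsCount;  // number of inputs per neuron
      varying vec2 vUV;            // position of the output neuron

      void main(void){
        float sum = 0.0;
        // GLSL ES 1.0 requires a constant loop bound, hence the early break:
        for (float i = 0.0; i < 256.0; i += 1.0){
          if (i >= uInputsCount) break;
          float u = (i + 0.5) / uInputsCount;
          float xVal = texture2D(uInputs,  vec2(u, 0.5)).r;
          float wVal = texture2D(uWeights, vec2(u, vUV.y)).r;
          sum += xVal * wVal;
        }
        // ReLU activation, written as the red channel of the pixel color:
        gl_FragColor = vec4(max(sum, 0.0), 0.0, 0.0, 1.0);
      }
    `;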

So the speed-limiting factor is the GPU. The more powerful the GPU, the higher the detection rate per second: we can detect faster, track more accurately, or use deeper and heavier but better-performing neural network models.

With WebGL, we can get the best out of the GPU in the web browser. WebGL is only a thin interface bound to DirectX, OpenGL or Vulkan depending on the operating system, so we can exploit from 80% to 100% of the GPU, as in native applications.

GPU-oriented hardware for embedded systems is now available. Nvidia released the Jetson TX1 in April 2014, and in August 2018 we successfully embedded our deep learning framework on the Nvidia Jetson TX2. We have publicly released the setup we use to run web applications on this amazing hardware and called it JetsonJS. The GitHub repository of the project is here: github.com/jeeliz/jetsonjs.

Nvidia Jetson TX2 devkit

Jeeliz embedded solutions

The JeelizBox

The goal of the JeelizBox is to run Jeeliz applications in an embedded context. It is small and power efficient, so it can be embedded anywhere. We even managed to fit it into a thin advertisement display. Its power consumption is under 50 W, so it does not heat up much. The applications are stored on SD cards which can be hot-swapped, like Game Boy cartridges! The JeelizBox has the following features:

  • applications stored on SD cards which can be hot-plugged,
  • Wi-Fi,
  • a WebSocket server to stream data to other devices (see the client sketch below),
  • full HD display output over HDMI.

The JeelizBox software is based on JetsonJS.
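
For example, another device on the same network could consume the data streamed by the box with a standard WebSocket client. The IP address, port and message format below are purely illustrative:

    // Connect to the WebSocket server of the JeelizBox
    // (address, port and message format are illustrative):
    const socket = new WebSocket('ws://192.168.1.42:8080');

    socket.onopen = () => {
      console.log('Connected to the JeelizBox');
    };

    socket.onmessage = (event) => {
      // We assume here that detection results are streamed as JSON messages:
      const detection = JSON.parse(event.data);
      console.log('Detection received:', detection);
    };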

Custom hardware

We can also build custom hardware with JetsonJS. We are currently working on a pupillometer, embedding JeelizPupillometry with JetsonJS. We do not use the JeelizBox for this purpose because we need very specific hardware:

  • an infrared light to illuminate the eyes for the IR camera,
  • a visible LED light to trigger pupil dilation,
  • a small 5-inch touchscreen display to control the device,
  • for later versions: a battery and its charge controller.

More about this project is coming soon, so stay tuned by following us on Twitter, LinkedIn or YouTube.