How to integrate FLIR BOSON Thermal Camera to NVIDIA Jetson Modules?

Jetson AGX Xavier | Jetson Nano | Jetson Xavier NX

17 February 2021

1- Using the V4L2 plugin in a GStreamer pipeline

2- Manipulating GStreamer capabilities

3- Creating a pipeline for the FLIR Boson thermal camera


Hardware: Applied to all Forecr Products

OS: Applied to all Jetpack versions


In this post, we are going to use a FLIR Boson thermal camera with NVIDIA Jetson modules, using GStreamer. The camera is used the same way on all Jetson modules, so it does not matter which one you have.

By default, the GStreamer packages are installed by the JetPack software, so there is no need to install them from scratch.

Our FLIR Boson camera module's full model name is "Boson 320, 92° (HFOV), 2.3 mm". The Boson comes in two resolutions: 640x512 and 320x256 pixels. Plug the camera module into the Jetson device with a USB Type-C cable.

The first pipeline should look like the one below.

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=512,format=I420 ! glimagesink

The properties between the exclamation points ("!") are known as capabilities, or "caps" for short. Caps identify what type of data flows between elements.

For this pipeline, "v4l2src" is selected as the source plugin. "v4l2src" reads frames from a Video4Linux2 device. When you plug the camera into the device, a "/dev/video*" node should appear. The number in the video device name depends on how many USB devices are plugged into your Jetson device. The format is set to "I420" and the sink plugin is set to "glimagesink".
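If you are unsure which "/dev/video*" node belongs to the camera, a quick sketch like the one below lists the Video4Linux2 nodes currently present (the exact index shown is just an example; it depends on enumeration order):

```python
import glob

# List the Video4Linux2 device nodes currently present. On a Jetson
# with one USB camera attached this is typically ["/dev/video0"],
# but the exact index depends on enumeration order.
devices = sorted(glob.glob("/dev/video*"))
print(devices)
```

Unplugging and replugging the camera while comparing this list is a simple way to spot which node it gets.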



After running the GStreamer pipeline, the footage should look like the image below.



The other pipeline we use can be seen below.

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=320,height=256,format=GRAY16_LE ! videoconvert ! glimagesink

The sensor returns image data that is 14.2 bits wide (commonly referred to as 16 bits), but most commercial displays can only show 8 bits of data. In other words, the video is displayed on a 0-255 scale rather than the sensor's full 0.00-16383.75 range.
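A minimal sketch of one way to map the 16-bit data down to an 8-bit display range is simple min-max scaling with NumPy. This is only an illustration, not the camera's built-in AGC algorithm; the frame here is simulated random data:

```python
import numpy as np

def scale_to_8bit(frame16: np.ndarray) -> np.ndarray:
    """Min-max scale a 16-bit thermal frame to the 0-255 display range."""
    lo = int(frame16.min())
    hi = int(frame16.max())
    if hi == lo:  # flat frame: avoid division by zero
        return np.zeros_like(frame16, dtype=np.uint8)
    scaled = (frame16.astype(np.float32) - lo) * (255.0 / (hi - lo))
    return scaled.astype(np.uint8)

# Simulated 320x256 16-bit frame; the sensor's useful range is about
# 14 bits, i.e. raw values up to roughly 16383.
frame = np.random.randint(0, 16384, size=(256, 320), dtype=np.uint16)
frame8 = scale_to_8bit(frame)
print(frame8.dtype, frame8.min(), frame8.max())
```

In practice you would tune the lo/hi bounds (for example with percentiles) rather than using the raw min and max, which are sensitive to hot or dead pixels.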

This pipeline lets you capture the raw data and customize the streamed data.
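For example, the same pipeline can feed the raw 16-bit frames into OpenCV by replacing the display sink with an appsink. This sketch assumes an OpenCV build with GStreamer support (the `cv2.CAP_GSTREAMER` backend); the capture part is shown as comments since it needs the camera attached:

```python
# Build the GStreamer pipeline string that mirrors the gst-launch
# command above, terminated with appsink so OpenCV can pull frames.
pipeline = (
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,width=320,height=256,format=GRAY16_LE ! "
    "appsink"
)
print(pipeline)

# With the camera attached and OpenCV built with GStreamer support:
# import cv2
# cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
# ok, frame = cap.read()  # frame is a 16-bit single-channel array
```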


Controlling Parameters of FLIR Boson Cameras via the Boson SDK

First, download the Boson Software Development Kit (SDK). Please click here to download the SDK. After downloading it, extract the compressed file, rename the directory to "BosonSDK", and copy it to wherever you want to work.

You should install the "python3-pip" package and the "pyserial" module.


sudo apt-get install python3-pip

pip3 install pyserial

Change into the directory and start the compilation as shown below.

cd /home/nvidia/BosonSDK/FSLP_Files

make all

If you get a "-m64" error, delete the "-m64" parameters from the "Makefile" and then start again.

After the compilation completes, copy the "99-flir.rules" file to the "/etc/udev/rules.d" directory. This allows all users to use serial communication with the camera.

The file can be found at the end of this blog post. After copying the file, reboot the system.

sudo cp 99-flir.rules /etc/udev/rules.d/

sudo reboot

Start the python3 command-line prompt and run the commands below in order. Pay attention to the "os.chdir" and "manualport" paths: "os.chdir" takes the directory containing the SDK folder, and "manualport" is the camera's serial device path. It is better to check both before running the commands.


>>> import os

>>> os.chdir("/home/username")

>>> from BosonSDK import *

>>> myport = pyClient.Initialize(manualport="/dev/ttyACM0") # or manualport="COM7" on Windows

>>> pyClient.bosonRunFFC()

>>> result, serialnumber = pyClient.bosonGetCameraSN()

>>> pyClient.Close(myport)



After these commands, you should see the serial number printed as shown above.

You can find the zip file here.
Thanks for reading.