NOTE: this page describes what Jami plugins are and how to install and use them.
Jami plugins
Since September 2020, the Jami team has offered plugins as a call and chat feature for Linux, Windows, and Android users. This means that you can now personalize your call/chat experience by using one of our available plugins. But that is not all: you can also turn your own ideas into a brand new plugin!
To properly set up a plugin, follow the steps in #How to use it?.
To build an available plugin, please refer to the #How to build? instructions.
To create your own plugin, please refer to [Create Plugin](7.1 - Create Plugin) instructions.
How does it work?
Jami can be broken down into three main components that interact with each other: the daemon, LRC, and the clients.
The daemon is the core of Jami; although it does not interact with users directly, it is involved in every command. The daemon therefore has a JamiPluginManager class that, among other actions, performs install/uninstall and load/unload, edits preferences, and controls plugin usage.
Despite the daemon's importance, what a plugin effectively does to call video/audio or to a chat message is unknown to it, in the same way that the daemon does not know what LRC or the client interfaces effectively do.
Plugins can thus be seen as a fourth interacting component in Jami.
The plugin system inside Jami exposes different APIs that plugins can use, for instance the Chat Handler and the Media Handler APIs. The latter enables plugins to modify audio and video streams from Jami calls; it is used by our GreenScreen plugin, but it could equally be used to build a YouTube streaming system, various Instagram-style filters, a real-time translation service, etc.
Plugins can be composed of one or more media and chat handlers, which are responsible for attaching a data stream from Jami to a data process, and detaching it again. Each handler represents a functionality; handlers can be completely different from one another, or they can be modified versions of the same core process. Our example has a single functionality: the GreenScreen plugin has one media handler whose data process segments the foreground of a video frame and applies another image to the background, just like the green screens used in movies!
To use a custom functionality, Jami must know all of a plugin's handlers, which handler is going to be used, and the data that should be processed. A plugin's handlers are created when the plugin is loaded and are shared with the daemon's Plugin Manager. The data lives inside the Jami flow (for a call plugin, in the event of a new call, Jami creates and stores the corresponding media stream subjects). Finally, once a user activates a plugin functionality, Jami tells the corresponding handler to attach the available data; when it is deactivated, Jami tells the handler to detach.
How to use it?
Setup
A Jami plugin is a pluginname.jpl file, and it must be installed into your Jami client.
Once installed, Jami adds the new plugin to the available plugins list, but it is not ready for use yet. Plugins are libraries, and they must be loaded before they can be exposed.
Moreover, plugins may have preferences; besides the install/uninstall and load/unload actions, it is possible to modify those preferences. For example, our GreenScreen plugin allows the user to change the background image displayed.
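Under the hood, a .jpl package is essentially a ZIP archive bundling the plugin's manifest and libraries. The sketch below illustrates the idea by packing and listing a dummy archive with Python's stdlib zipfile CLI; the file names used (manifest.json, lib/) are illustrative, not the exact Jami layout.

```shell
# Sketch: pack and inspect a dummy plugin archive with Python's stdlib zipfile CLI.
# The layout (manifest.json, lib/) is illustrative, not the exact Jami layout.
workdir=$(mktemp -d)
cd "$workdir"
echo '{"name": "ExamplePlugin", "version": "1.0.0"}' > manifest.json
mkdir -p lib
touch lib/libexampleplugin.so
# A .jpl is ZIP-formatted, so the stdlib zipfile module can create one
python3 -m zipfile -c ExamplePlugin.jpl manifest.json lib/
# List the packaged entries
python3 -m zipfile -l ExamplePlugin.jpl
```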
Android
To set up a plugin on Android, go to Settings, enable plugins if they are disabled, and select a plugin file from your phone. Once installed, the plugin is automatically loaded. Optionally, you can load/unload it manually using the checkbox in the plugin list.
To uninstall on Android, click on the plugin: an uninstall option appears along with the preferences and a reset preferences option. For a preference modification to take effect, the plugin has to be reloaded.
Linux/Windows
Similarly, for the client-qt (available on Linux and Windows) and the client-gnome (available only on Linux), go to Preferences, enable plugins if they are disabled, and select a plugin file from your computer. Each plugin in the list is linked to two buttons:
Client-qt: a load/unload button and a preferences button;
Client-gnome: a load/unload button and an uninstall button. With client-gnome it is not possible to change a plugin's preferences.
Use!
A media handler functionality only takes effect if you turn it on during a call. On Android, Linux, or Windows you can do so by clicking the plugins icon on your call screen.
Similarly, for a chat handler functionality, you will see a plugin icon in the chat window, as in the images below.
How to build?
If you want to do something with your video call, chances are you will do so with OpenCV and/or deep learning models (Tensorflow, PyTorch, etc.). So, before building the plugin itself, you need to build the plugin's dependencies.
Dependencies
Here we give you the steps to build OpenCV and ONNX, but do not feel limited to these libraries. We offer a [page](7.2 - Tensorflow Plugin) with a detailed explanation of how to build the Tensorflow C++ API for Windows, Linux, and Android. Other libraries should work as long as they and the plugin are correctly built!
OpenCV 4.1.1
We added OpenCV 4.1.1 as a contrib in the daemon. This way you can easily build OpenCV for Android, Linux, and Windows; just follow the corresponding instructions.
Windows
set DAEMON=<path/to/daemon>
cd %DAEMON%/compat/msvc
python3 winmake.py -fb opencv
Linux
With Docker (recommended):
export DAEMON=<path/to/daemon>
cd ${DAEMON}/../
docker build -f plugins/docker/Dockerfile_ubuntu_18.04_onnxruntime -t plugins-linux .
docker run --rm -it -v ${DAEMON}/../:/home/plugins/jami:rw plugins-linux:latest /bin/bash
cd jami/plugins/contrib
cd ../../daemon/contrib
mkdir native
cd native
../bootstrap --disable-argon2 --disable-asio --disable-fmt --disable-gcrypt --disable-gmp --disable-gnutls --disable-gpg-error --disable-gsm --disable-http_parser --disable-iconv --disable-jack --disable-jsoncpp --disable-libarchive --disable-libressl --disable-msgpack --disable-natpmp --disable-nettle --enable-opencv --disable-opendht --disable-pjproject --disable-portaudio --disable-restinio --disable-secp256k1 --disable-speexdsp --disable-upnp --disable-uuid --disable-yaml-cpp --disable-zlib
make list
make fetch opencv opencv_contrib
make
Using your own system:
export DAEMON=<path/to/daemon>
cd ${DAEMON}/contrib/native
../bootstrap --enable-ffmpeg --disable-argon2 --disable-asio --disable-fmt --disable-gcrypt --disable-gmp --disable-gnutls --disable-gpg-error --disable-gsm --disable-http_parser --disable-iconv --disable-jack --disable-jsoncpp --disable-libarchive --disable-libressl --disable-msgpack --disable-natpmp --disable-nettle --enable-opencv --disable-opendht --disable-pjproject --disable-portaudio --disable-restinio --disable-secp256k1 --disable-speexdsp --disable-upnp --disable-uuid --disable-yaml-cpp --disable-zlib
make list
make fetch opencv opencv_contrib
make
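The long bootstrap line above disables every contrib except the ones needed. If you maintain your own build scripts, the argument list can be assembled from package lists instead of one hard-to-diff line; a minimal sketch (not part of the official tooling):

```shell
# Sketch: assemble the bootstrap arguments from package lists instead of one long line.
# The package names mirror the bootstrap command above; this helper is illustrative only.
enable="ffmpeg opencv"
disable="argon2 asio fmt gcrypt gmp gnutls gpg-error gsm http_parser iconv \
jack jsoncpp libarchive libressl msgpack natpmp nettle opendht pjproject \
portaudio restinio secp256k1 speexdsp upnp uuid yaml-cpp zlib"
args=""
for pkg in $enable;  do args="$args --enable-$pkg";  done
for pkg in $disable; do args="$args --disable-$pkg"; done
# The assembled list would then be passed to bootstrap, e.g.:
echo "../bootstrap$args"
```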
Android
Using Docker (recommended):
export DAEMON=<path/to/daemon>
cd ${DAEMON}/../
docker build -f plugins/docker/Dockerfile_android_onnxruntime -t plugins-android .
docker run --rm -it -v ${DAEMON}/:/home/gradle/src:rw plugins-android:latest /bin/bash
cd plugins/contrib
ANDROID_ABI="arm64-v8a" sh build-dependencies.sh
Using your own system:
export DAEMON=<path/to/daemon>
cd ${DAEMON}
export ANDROID_NDK=<NDK>
export ANDROID_ABI=arm64-v8a
export ANDROID_API=29
export TOOLCHAIN=$ANDROID_NDK/toolchains/llvm/prebuilt/linux-x86_64
export TARGET=aarch64-linux-android
export CC=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang
export CXX=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang++
export AR=$TOOLCHAIN/bin/$TARGET-ar
export LD=$TOOLCHAIN/bin/$TARGET-ld
export RANLIB=$TOOLCHAIN/bin/$TARGET-ranlib
export STRIP=$TOOLCHAIN/bin/$TARGET-strip
export PATH=$PATH:$TOOLCHAIN/bin
cd contrib
mkdir native-${TARGET}
cd native-${TARGET}
../bootstrap --build=x86_64-pc-linux-gnu --host=$TARGET$ANDROID_API --enable-opencv --enable-opencv_contrib
make
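A typo in these exports ($ANDROID_NDK, $TARGET, or $ANDROID_API) often only surfaces deep into make. A quick pre-flight check that the cross-compiler actually resolves can save a failed build. The sketch below uses a throwaway fake toolchain directory so it is self-contained; against a real NDK, the `[ -x "$CC" ]` test is the part you would run.

```shell
# Sketch: pre-flight check that the cross-compiler resolves before running bootstrap.
# A throwaway fake toolchain stands in for a real NDK so the sketch is self-contained.
TOOLCHAIN=$(mktemp -d)
TARGET=aarch64-linux-android
ANDROID_API=29
mkdir -p "$TOOLCHAIN/bin"
touch "$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang"
chmod +x "$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang"
CC=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang
# With a real NDK, a missing or non-executable $CC means one of the exports is wrong
if [ -x "$CC" ]; then toolchain_ok=1; else toolchain_ok=0; fi
echo "toolchain_ok=$toolchain_ok"
```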
Onnxruntime 1.6.0
A difficulty for many people working with deep learning models is how to deploy them. With that in mind, we give users the possibility of using onnxruntime. There are several development libraries for training and testing models, but they are usually too heavy to deploy; Tensorflow with CUDA support, for instance, can easily surpass 400 MB. For our GreenScreen plugin we chose onnxruntime because it is lighter (a library size of about 140 MB with CUDA support) and supports model conversion from several development libraries (Tensorflow, PyTorch, Caffe, etc.).
For more advanced and curious third-party developers, we also [provide instructions](7.2 - Tensorflow Plugin) to build the Tensorflow C++ API for Windows and Linux, and the TensorflowLite C++ API for Android.
To build onnxruntime-based plugins for Linux and Android, we strongly recommend using the Dockerfiles available under <plugins>/docker/. We do not offer a Windows Dockerfile, but below we carefully guide you through the proper build of this library for our three supported platforms.
If you want to build onnxruntime with Nvidia GPU support, be sure that you have a CUDA-capable GPU, that you have followed all installation steps for the Nvidia drivers, the CUDA Toolkit, and cuDNN, and that their versions match.
The following links may be very helpful:
https://developer.nvidia.com/cuda-gpus
https://developer.nvidia.com/cuda-toolkit-archive
https://developer.nvidia.com/cudnn
Linux and Android
We added onnxruntime as a contrib in the daemon. This way you can easily build onnxruntime for Android and Linux.
Linux - Without acceleration:
export DAEMON=<path/to/daemon>
cd ${DAEMON}/contrib/native
../bootstrap
make .onnx
Linux - With CUDA acceleration (CUDA 10.2):
export CUDA_PATH=/usr/local/cuda/
export CUDA_HOME=${CUDA_PATH}
export CUDNN_PATH=/usr/lib/x86_64-linux-gnu/
export CUDNN_HOME=${CUDNN_PATH}
export CUDA_VERSION=10.2
export USE_NVIDIA=True
export DAEMON=<path/to/daemon>
cd ${DAEMON}/contrib/native
../bootstrap
make .onnx
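Since onnxruntime's CUDA build is sensitive to version mismatches, it can help to verify that $CUDA_VERSION matches what nvcc reports before starting the long build. A sketch of such a check (a canned string stands in for the real `nvcc --version` output so the example is self-contained):

```shell
# Sketch: check that $CUDA_VERSION matches what nvcc reports before the long build.
# A canned string stands in for the real compiler output here (illustrative only).
CUDA_VERSION=10.2
nvcc_output="Cuda compilation tools, release 10.2, V10.2.89"  # e.g. $(nvcc --version | tail -n 1)
case "$nvcc_output" in
  *"release $CUDA_VERSION,"*) cuda_match=1 ;;
  *)                          cuda_match=0 ;;
esac
echo "cuda_match=$cuda_match"
```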
Android - With NNAPI acceleration:
export DAEMON=<path/to/daemon>
cd ${DAEMON}
export ANDROID_NDK=<NDK>
export ANDROID_ABI=arm64-v8a
export ANDROID_API=29
export TOOLCHAIN=$ANDROID_NDK/toolchains/llvm/prebuilt/linux-x86_64
export TARGET=aarch64-linux-android
export CC=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang
export CXX=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang++
export AR=$TOOLCHAIN/bin/$TARGET-ar
export LD=$TOOLCHAIN/bin/$TARGET-ld
export RANLIB=$TOOLCHAIN/bin/$TARGET-ranlib
export STRIP=$TOOLCHAIN/bin/$TARGET-strip
export PATH=$PATH:$TOOLCHAIN/bin
cd contrib
mkdir native-${TARGET}
cd native-${TARGET}
../bootstrap --build=x86_64-pc-linux-gnu --host=$TARGET$ANDROID_API
make .onnx
Windows
Pre-build:
mkdir pluginsEnv
export PLUGIN_ENV=<full-path/pluginsEnv>
cd pluginsEnv
mkdir onnxruntime
mkdir onnxruntime/cpu
mkdir onnxruntime/nvidia-gpu
mkdir onnxruntime/include
git clone https://github.com/microsoft/onnxruntime.git onnx
cd onnx
git checkout v1.6.0 && git checkout -b v1.6.0
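The staging directories above can also be created in a single mkdir -p call, assuming a POSIX-style shell such as Git Bash (which the cp commands in this section already imply). A temporary directory stands in for your real pluginsEnv location in this sketch:

```shell
# Sketch: create the same onnxruntime staging layout in one mkdir -p call.
# A temporary directory stands in for your real pluginsEnv location.
PLUGIN_ENV=$(mktemp -d)/pluginsEnv
mkdir -p "$PLUGIN_ENV/onnxruntime/cpu" \
         "$PLUGIN_ENV/onnxruntime/nvidia-gpu" \
         "$PLUGIN_ENV/onnxruntime/include"
ls "$PLUGIN_ENV/onnxruntime"
```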
Without acceleration:
.\build.bat --config Release --build_shared_lib --parallel --cmake_generator "Visual Studio 16 2019"
cp ./build/Windows/Release/Release/onnxruntime.dll ../onnxruntime/cpu/onnxruntime.dll
With CUDA acceleration (CUDA 10.2):
.\build.bat --config Release --build_shared_lib --parallel --cmake_generator "Visual Studio 16 2019" --use_cuda --cudnn_home <cudnn home path> --cuda_home <cuda home path> --cuda_version 10.2
cp ./build/Windows/Release/Release/onnxruntime.dll ../onnxruntime/nvidia-gpu/onnxruntime.dll
Post-build:
cp -r ./include/onnxruntime/core/ ../onnxruntime/include/
For further build instructions, please refer to the official onnxruntime GitHub repository.
Plugin
To exemplify a plugin build, we will use the GreenScreen plugin available here.
Linux/Android
First, go to the plugins repository in your cloned ring-project:
cd <ring-project>/plugins
Linux - Nvidia GPU
PROCESSOR=NVIDIA python3 build-plugin.py --projects=GreenScreen
Linux - CPU
python3 build-plugin.py --projects=GreenScreen
Android
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64/jre
export ANDROID_HOME=/home/${USER}/Android/Sdk
export ANDROID_SDK=${ANDROID_HOME}
export ANDROID_NDK=${ANDROID_HOME}/ndk/21.1.6352462
export ANDROID_NDK_ROOT=${ANDROID_NDK}
export PATH=${PATH}:${ANDROID_HOME}/tools:${ANDROID_HOME}/platform-tools:${ANDROID_NDK}:${JAVA_HOME}/bin
ANDROID_ABI="arm64-v8a armeabi-v7a x86_64" python3 build-plugin.py --projects=GreenScreen --distribution=android
The GreenScreen.jpl file will be available under <ring-project>/plugins/build/.
Windows
The Windows build of plugins is tied to the daemon repository and its build scripts. So, to build our example plugin, you have to:
cd <ring-project>/daemon/compat/msvc
python3 winmake.py -fb GreenScreen
The GreenScreen.jpl file will be available under <ring-project>/plugins/build/.