Guide

This guide focuses on the Unity asset WebRTC Video Chat. Most of it also applies to WebRTC Network, which is included in WebRTC Video Chat but is also available separately.

Important folders and files

Please note the structure changed in V0.983.

  • scripts/UnityCallFactory.cs – Main access point for network/video/audio call functionality
  • chatapp – contains the text only ChatApp example using a star topology
  • callapp – contains a fully functional 1-to-1 video chat app
  • examples – contains a few simple examples to learn how to use the API
  • extras – contains experimental features that are not fully supported
  • server.zip – Contains an open source node.js signaling server. More at Server side info / Tutorials
  • Plugins
    • Contains C# libraries that provide the API
    • The subfolders contain wrappers that map the API to platform and processor architecture specific code
    • Never move or rename this folder

Quickstart

Recommended Unity version: 2018 LTS

Project setup for Windows, Mac OS

In general these platforms should work right out of the box. To avoid problems, enable the “Run In Background” flag in the Player Settings so your network code keeps running while the application is in the background.
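If you prefer to set this from code rather than the Player Settings UI, Unity exposes the same flag at runtime (a minimal sketch using only standard Unity API):

```csharp
using UnityEngine;

public class KeepRunningInBackground : MonoBehaviour
{
    void Awake()
    {
        // Mirrors the "Run In Background" Player Setting so network
        // connections stay alive while the app window loses focus.
        Application.runInBackground = true;
    }
}
```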

MacOS – special cases

WebGL

Supported browsers: Firefox and browsers based on Chrome / Chromium

The asset automatically loads a JavaScript plugin once accessed. It should run without any specific project configuration. The WebGL plugin is also available separately as open source on GitHub and can be used for regular web pages without Unity.

If you target WebGL, make sure to keep the asset up to date. Browsers regularly remove deprecated API calls, so a browser update can easily break an old asset version.

Note that Chrome denies access to WebRTC features if the page is opened via “file://” or plain “http://”. The page must be loaded via a secure URL (“https://”) and the signaling server must use secure WebSockets (“wss://”). Otherwise, after a recent Chrome update, you may get an uncaught exception: “Uncaught TypeError: Cannot read property enumerateDevices of undefined”.

Special cases

Safari 12.2 and newer might work if the user manually allows video & audio device access and allows autoplay. Safari’s autoplay protection blocks received video if the user doesn’t send video. Safari is not yet fully supported.

Edge (before Chromium) – The browser is missing DataChannel support and won’t be able to run the asset.

Android

Build info:
Minimum API Level: 16 (Android 4.1.1) or higher (21 for arm64!)
Architectures: armeabi-v7a, arm64-v8a, x86
JDK required: 8 or 1.8  (JavaVersion.VERSION_1_8 in gradle)
SDK used: 28
NDK used: ndk-r16b
(likely works with different SDK / NDK versions as well)

Project setup

.NET backend: .NET 4 is recommended. (.NET 3.5 has a bug that can cause “wss://” connections to fail; plain “ws://” works fine.)
Backend: Mono & IL2CPP
Minify: None (Proguard with default settings is known to remove needed files and break the build!)

The Android specific plugin now comes in .aar format and Unity should automatically merge its AndroidManifest.xml with the one used for your final application. Make sure your application requests Camera & Microphone permissions before using these features via the asset. See the Unity documentation on Android permissions.
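A minimal sketch of requesting those permissions at runtime, using Unity’s own UnityEngine.Android.Permission API (available in Unity 2018.3 and later; on older versions permissions are granted at install time):

```csharp
#if UNITY_ANDROID
using UnityEngine;
using UnityEngine.Android;

public class RequestCallPermissions : MonoBehaviour
{
    void Start()
    {
        // Ask for camera and microphone access before the asset tries
        // to open the devices. Without these, device access will fail.
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
            Permission.RequestUserPermission(Permission.Camera);
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
            Permission.RequestUserPermission(Permission.Microphone);
    }
}
#endif
```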

If using the recommended Unity version and the latest asset release the CallApp example should run out of the box. Smaller examples might not support Android to keep them simple!

Special cases

V0.983 and earlier did not support arm64! Do not enable arm64 in Unity with older asset versions, otherwise the plugin will fail to load.

Android specific behaviour and API

The Android version of the asset requires you to ensure that all permissions it needs are granted before use. In some cases multiple plugins or a custom AndroidManifest.xml might interfere with the plugin. To fix issues related to this, please read “manual setup” above and check the app permissions within the Android settings.

Received audio streams on Android are treated as an incoming call. This means, depending on the device settings, the audio might be unusually quiet (the system assumes the user holds the device to their ear). To turn on the speaker, call UnityCallFactory.Instance.SetLoudspeakerStatus(true). V0.983 and later switch Android into “communication mode”, allowing the hardware volume buttons to control the call volume. You can turn this off via the flag UnityCallFactory.ANDROID_SET_COMMUNICATION_MODE.
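Putting the calls above together, a minimal sketch (using only the API names mentioned in this section; exact initialization may differ between asset versions):

```csharp
using UnityEngine;

public class AndroidAudioSetup : MonoBehaviour
{
    void Start()
    {
        // Optional (V0.983+): opt out of communication mode before the
        // factory is used, if you want to manage audio routing yourself.
        // UnityCallFactory.ANDROID_SET_COMMUNICATION_MODE = false;

        // Route call audio to the loudspeaker instead of the earpiece.
        UnityCallFactory.Instance.SetLoudspeakerStatus(true);
    }
}
```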

Video devices also behave differently on Android because they are fixed to the device. An easy way to get the device name of the front facing camera is “string GetDefaultVideoDevice()”, or you can check a known device via “bool IsFrontFacing(string deviceName)”, both on UnityCallFactory.Instance. Unlike other platforms, Android only allows access to a single video device at a time. External video devices are not supported.
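Based on the two methods above, a quick check of the default camera could be sketched like this (assuming UnityCallFactory has already been initialized):

```csharp
using UnityEngine;

public class FrontCameraCheck : MonoBehaviour
{
    void Start()
    {
        // Usually the front facing camera on Android.
        string device = UnityCallFactory.Instance.GetDefaultVideoDevice();
        if (device != null)
        {
            bool front = UnityCallFactory.Instance.IsFrontFacing(device);
            Debug.Log("Default camera: " + device + ", front facing: " + front);
        }
    }
}
```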

Some Android devices report more cameras than are physically built in. For example, after a recent update the Samsung Galaxy S9 shows three video devices. Two of them are the same front facing camera – one for selfies and one for group selfies using a wider angle.

For other Android specific methods, check the file AndroidHelper.cs. It contains calls to the Android API for several tasks, such as increasing or decreasing the call volume, changing the audio mode, checking permissions, or opening the permission view. These are provided to help during development but are not fully supported / tested.

iOS

Build info:
Processor Architectures: arm32 and arm64.
SDK used: iphoneos12.2
Xcode used: 10.2
Platform name: iphoneos

Project setup

Minimum iOS Version: 9.3 (9.0 might work but untested)
Backend: IL2CPP
Devices: iPhone & iPad (both supported)
Target SDK: Device SDK. x86 / x64 simulators are not supported!

Make sure to set a Camera and Microphone Usage Description. If those fields are empty, newer iOS versions will crash once a device is accessed. You might also want to set your signing Team ID in the Unity build settings so the Xcode build can complete without manually setting additional values.
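These fields live in Player Settings, but if you build from scripts you can also set them from an editor script. A sketch (the PlayerSettings properties shown are standard Unity editor API; the description strings are placeholders):

```csharp
#if UNITY_EDITOR
using UnityEditor;

public static class IosUsageDescriptions
{
    [MenuItem("Build/Set iOS Usage Descriptions")]
    public static void Apply()
    {
        // Empty usage descriptions cause a crash on device access in newer iOS.
        PlayerSettings.iOS.cameraUsageDescription = "Camera is used for video calls.";
        PlayerSettings.iOS.microphoneUsageDescription = "Microphone is used for voice calls.";
    }
}
#endif
```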

Besides the configuration above iOS builds should run without manual build steps. The script IosPostBuild.cs will automatically change some values in your XCode project e.g. by making sure the asset’s framework folder is embedded in your project. If this doesn’t work see Special cases below.

Do not move the asset folder. If you have to then you need to update the paths in IosPostBuild.cs or configure the Xcode project manually otherwise the build will fail!

Please check with Unity and iOS documentation first before requesting support.

Special cases

Unity 2017.2 or below: You still need to set a flag yourself in Xcode: Project settings -> Build phases -> Copy Frameworks -> set the flag “Code Sign On Copy”.

Crash during start, either “webrtccsharpwrap not found” or a crash during the call to “GetVersion()”: This error can have many different causes. Please see the iOS section of the FAQ.

V0.982 only: Not working on old armv7 devices if downloaded via TestFlight or the App Store: Search for the folder webrtccsharpwrap.framework, open the file Info.plist in Xcode, and remove the entry “Required device capabilities” completely.

Some Unity versions fail to execute the build script correctly if you don’t press “Switch Platform” for iOS before pressing “Build & Run”.

If building via Unity batch mode or using the Unity Test Runner, make sure to trigger “Switch Platform” first via your script, then restart Unity before building.

Running the examples

ChatApp

The ChatApp allows a user to create or join a chat room. After entering a room, users can send chat messages to everyone in the room. The user who created the room acts as a server and relays the messages to all connected users.

To run the example start the scene at: WebRtcNetwork\example\chatscene. The file ChatApp.cs contains most of the code and is fully documented. Use this example to learn how to use the library to send any data across the network.

Note that the example contains two instances of the ChatApp so you can test it locally on a single machine: simply open a room with one app and join it with the second. You can also run the example on multiple machines inside your LAN, or using two different internet connections, and open / join rooms. An internet connection is always required as long as the default signaling server is used.

CallApp

The scene at WebRtcVideoChat/callapp/callscene contains two instances of the CallApp, allowing you to test the video and audio library within a single application or run the same program on two different systems and connect them. It supports streaming audio and video and sending text messages to the other side. Simply enter a shared text or password on both sides and press the join button. Both apps will connect and start streaming audio and video (if available). Note that when testing on a single system, a video device can only be accessed by one app at a time. Connecting locally with audio will cause an echo (make sure your speakers aren’t set too loud).