This guide focuses on the Unity asset WebRTC Video Chat, but most of it also applies to WebRTC Network, which is part of the asset and is also available separately.

Important folders and files

Please note the structure changed in V0.983.

  • scripts/UnityCallFactory.cs – Main access point for the Network/Video/Audio call functionality
  • chatapp – contains the text only ChatApp example using a star topology
  • callapp – contains a fully functional 1-to-1 video chat app
  • examples – contains a few simple examples to learn how to use the API
  • extras – contains experimental features that are not fully supported
  • – Contains an open source Node.js signaling server. More at Server side info / Tutorials
  • Plugins
    • Contains C# libraries that provide the API
    • The subfolders contain wrappers that map the API to platform and processor architecture specific code
    • Never move or rename this folder


Project setup for Windows and macOS

In general, these platforms should work right out of the box. To avoid problems, set the flag “Run In Background” in the Player Settings so your network code keeps working even while the application is in the background.

macOS – Common problems


WebGL

With WebGL the asset automatically loads a JavaScript plugin once accessed. It should run without any specific project configuration. Note that Chrome denies access to WebRTC features if the page is opened via “file://” or “http://”. The page must be loaded via a secure URL (“https://”) and the signaling server must use secure websockets (“wss://”). Not doing so might result in the uncaught exception “Uncaught TypeError: Cannot read property enumerateDevices of undefined” after a recent Chrome update.


Android – Project setup

Minimum API Level: Android 4.1 (API level 16) or higher

It is recommended to use the .NET 4 scripting backend. The Mono .NET 3.5 backend has a bug that can cause the signaling server connection to fail if secure websockets (wss://) are used (plain ws:// works fine, though).

Update Unity 2018.3 and newer:

Depending on your project configuration, Unity might use runtime permissions for Android and not automatically request Camera / Microphone access for you. In this case you have to run the following before starting a call:

if (!UnityEngine.Android.Permission.HasUserAuthorizedPermission(UnityEngine.Android.Permission.Microphone))
    UnityEngine.Android.Permission.RequestUserPermission(UnityEngine.Android.Permission.Microphone);
if (!UnityEngine.Android.Permission.HasUserAuthorizedPermission(UnityEngine.Android.Permission.Camera))
    UnityEngine.Android.Permission.RequestUserPermission(UnityEngine.Android.Permission.Camera);

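Putting the checks above together, a minimal sketch of a component that requests both permissions before a call is started. The class name and the placement in Start are illustrative and not part of the asset; note that RequestUserPermission shows the dialog asynchronously, so check the permission again before actually creating a call:

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Hypothetical helper: attach to a scene object so both permissions
// are requested before any call object is created.
public class CallPermissions : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
            Permission.RequestUserPermission(Permission.Microphone);
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
            Permission.RequestUserPermission(Permission.Camera);
#endif
    }
}
```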
Starting with V0.982, the Android plugin should work immediately after import without any special setup.

The Android-specific plugin now comes in .aar format and Unity should be able to automatically merge its AndroidManifest.xml with the one used for your final application. During the first start of the application, Unity should request microphone and camera permissions from the user.

Android specific behaviour and API

The Android version of the asset requires you to ensure that all permissions it needs are granted before use. In some cases multiple plugins or a custom AndroidManifest.xml might interfere with the plugin. To fix issues related to this, please read “manual setup” above and check the app permissions within the Android settings.

Received audio streams on Android are treated as an incoming call. This means that, depending on the device settings, the audio might be unusually quiet (Android assumes the user is holding the device to their ear). To turn on the speaker, call UnityCallFactory.Instance.SetLoudspeakerStatus(true). V0.983 and later will switch Android into “Communication mode”, allowing the hardware volume buttons to control the call volume. You can turn this off via the flag UnityCallFactory.ANDROID_SET_COMMUNICATION_MODE.
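A minimal sketch of switching to the loudspeaker once a call is active. SetLoudspeakerStatus and the ANDROID_SET_COMMUNICATION_MODE flag are the ones named above; the surrounding method and the assumption that the flag must be cleared before initialization are illustrative:

```csharp
// Illustrative only: call this after the call is established.
void EnableLoudspeaker()
{
    // Route received call audio to the loudspeaker instead of the earpiece.
    UnityCallFactory.Instance.SetLoudspeakerStatus(true);
}

// To keep Android out of "Communication mode" (V0.983+), clear the flag,
// presumably before the factory initializes:
// UnityCallFactory.ANDROID_SET_COMMUNICATION_MODE = false;
```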

Video devices also behave differently on Android due to their fixed location. An easy way to get the device name of the front facing camera is “string GetDefaultVideoDevice()”, or you can check a known device via “bool IsFrontFacing(string deviceName)”, both on UnityCallFactory.Instance. Unlike on other platforms, Android only allows access to a single video device at a time. External video devices are not supported.
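A minimal sketch of selecting the front facing camera using the two methods mentioned above. It assumes UnityCallFactory is already initialized; the helper method itself is illustrative:

```csharp
// Illustrative helper: returns the name of the front facing camera.
string SelectFrontCamera(string knownDeviceName)
{
    var factory = UnityCallFactory.Instance;

    // Easiest way: the default video device is the front facing camera.
    string device = factory.GetDefaultVideoDevice();

    // Alternatively, verify a known device name explicitly:
    if (knownDeviceName != null && factory.IsFrontFacing(knownDeviceName))
        device = knownDeviceName;

    return device;
}
```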

For other Android-specific methods, check out the file AndroidHelper.cs. It contains calls to the Android API for several tasks such as increasing or decreasing the call volume, changing the audio mode, checking permissions, or opening the permission view.


iOS – Project setup

Player settings:

  • Set the Minimum iOS Version to 9.3
  • Backend should be IL2CPP
  • Make sure to set Camera and Microphone Usage Description. If those fields are empty newer iOS versions will crash once a device is accessed
  • The simulator is not supported. iPad and iPhone devices are supported
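Unity writes those usage descriptions into the generated Info.plist as the keys NSCameraUsageDescription and NSMicrophoneUsageDescription. If you ever edit the Xcode project by hand, the entries look like this (the description strings are just examples):

```xml
<key>NSCameraUsageDescription</key>
<string>The camera is used for video calls.</string>
<key>NSMicrophoneUsageDescription</key>
<string>The microphone is used for voice calls.</string>
```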

Building for iOS:

  • The script IosPostBuild.cs will automatically change some values in your XCode project e.g. making sure the framework folder is embedded in your project
  • Only needed for Unity versions below 2017.2: you still need to set a flag yourself in Xcode:
    Project settings -> Build phases -> Copy Frameworks -> enable the flag “Code Sign On Copy”
  • Depending on your personal setup you might still need to configure signing and other Unity- and iOS-specific build settings. Please check the Unity and iOS documentation before requesting support.

Common problems

Crash during start, either “webrtccsharpwrap not found” or a crash during the call to “GetVersion()”: this error can have many different causes. Please see the iOS section of the FAQ.

V0.982 only: not working on old armv7 devices if downloaded via TestFlight or the App Store: search for the folder webrtccsharpwrap.framework, open the file Info.plist in Xcode, and remove the entry “Required device capabilities” completely.

Running the examples


ChatApp

The ChatApp allows a user to create or join a chat room. After entering a room, users can send chat messages to everyone in the room. The user who created the room acts as a server
and relays the messages to all connected users.

To run the example start the scene at: WebRtcNetwork\example\chatscene. The file ChatApp.cs contains most of the code and is fully documented. Use this example to learn how to use the library to send any data across the network.

Note that the example contains two instances of the ChatApp. This is done so you can test it locally on a single machine. Simply open a room with one app and then join it with the second. You can also run the example on multiple machines inside your LAN, or open and join rooms using two different internet connections. An internet connection is always required as long as the default signaling server is used.


CallApp

The scene at WebRtcVideoChat/callapp/callscene contains two instances of the CallApp, allowing you to test the video and audio library within a single application or to run the same program on two different systems and connect them. It supports streaming audio and video and sending text messages to the other side. Simply enter a shared text or password on both sides and press the join button. Both apps will connect and start streaming audio and video (if available). Note that when testing on a single system, a video device can only be accessed by one app at a time. Connecting locally with audio will cause an echo (make sure your speakers aren’t set to a high volume).