
Video Call Application in Firebase in Kotlin

Last Updated : 16 Aug, 2024

Building a video call application with Kotlin and Firebase is a rewarding project that combines real-time communication, cloud services, and mobile development. In this article, we walk through the key pieces of such an app: Firebase Authentication for signing users in, Cloud Firestore as the database for signaling data, and WebRTC as the engine for the real-time call itself.

Prerequisites

Before we begin, make sure you have:

  1. A Firebase project set up in the Firebase Console.
  2. Android Studio installed.
  3. Basic knowledge of Kotlin and Android development.

1. Setting Up Firebase

Using Firebase requires creating a project in the Firebase Console and connecting it to your Android application. This step involves downloading the google-services.json file and placing it in your project's app directory, which lets Firebase recognize and authenticate your app. You also need to add Firebase dependencies to your Gradle files for the services you will use: Authentication, Firestore, and Storage. This setup is the foundation for all of the Firebase back-end services used in the rest of the article.

1.1. Create a Firebase Project

  1. Go to the Firebase Console.
  2. Click on "Add Project" and follow the instructions to create a new project.

1.2. Add Firebase to Your Android App

  1. In the Firebase Console, click on "Add app" and select Android.
  2. Register your app with your package name (e.g., com.example.videocall).
  3. Download the google-services.json file and place it in the app directory of your Android project.

1.3. Add Firebase Dependencies

Open your build.gradle files and add the necessary Firebase dependencies.

Project-level build.gradle:

buildscript {
    dependencies {
        // ...
        classpath 'com.google.gms:google-services:4.3.10'
    }
}

App-level build.gradle:

plugins {
    id 'com.android.application'
    id 'com.google.gms.google-services'
}

android {
    // ...
}

dependencies {
    // Firebase dependencies
    implementation 'com.google.firebase:firebase-auth:21.0.1'
    implementation 'com.google.firebase:firebase-firestore:24.0.0'
    implementation 'com.google.firebase:firebase-storage:20.0.0'
}
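Pinning each Firebase artifact individually can lead to version mismatches between libraries. An alternative, shown here as a sketch (the BoM version number is an assumption; check the Firebase release notes for the current one), is to import the Firebase Bill of Materials and omit per-library versions:

```groovy
dependencies {
    // Import the Firebase BoM; it aligns the versions of all Firebase libraries
    implementation platform('com.google.firebase:firebase-bom:32.7.0')

    // No versions needed once the BoM is in place
    implementation 'com.google.firebase:firebase-auth'
    implementation 'com.google.firebase:firebase-firestore'
    implementation 'com.google.firebase:firebase-storage'
}
```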

2. Implementing Authentication

In this section, we focus on setting up Firebase Authentication, which manages user sign-ins securely. We use anonymous authentication as a simple way to authenticate users without requiring explicit login credentials. This involves initializing FirebaseAuth in your application and signing users in anonymously. This authentication step is critical as it enables secure access to Firebase services and allows you to manage user sessions, which is particularly important for features like personalized experiences and user-specific data management.

2.1. Initialize Firebase Auth

In your MainActivity.kt:

Kotlin
import com.google.firebase.auth.FirebaseAuth

class MainActivity : AppCompatActivity() {
    private lateinit var auth: FirebaseAuth

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        
        auth = FirebaseAuth.getInstance()
    }
}

2.2. Sign in Anonymously

Add the following code to sign in users anonymously:

Kotlin
auth.signInAnonymously().addOnCompleteListener(this) { task ->
    if (task.isSuccessful) {
        // Sign-in successful
        val user = auth.currentUser
        // Proceed to video call setup
    } else {
        // Sign-in failed
        Log.w("MainActivity", "signInAnonymously:failure", task.exception)
    }
}

3. Setting Up Firestore

Cloud Firestore is a database from Firebase and Google Cloud for mobile, web, and server application development. In this part, we configure Firestore to store and synchronize signaling data in real time between users. We also define a data model for video call sessions that holds the caller and receiver IDs along with the SDP offer and answer. This structure keeps the signaling data for each call in a single document and drives the interaction between the two users during a session.

3.1. Define Firestore Structure

Create a calls collection in Firestore where each document represents a call session:

calls (collection)
└── callId (document)
    ├── callerId (string)
    ├── receiverId (string)
    ├── offer (map)
    └── answer (map)

3.2. Firestore Integration

Initialize Firestore in your app:

Kotlin
import com.google.firebase.firestore.FirebaseFirestore

class MainActivity : AppCompatActivity() {
    private lateinit var db: FirebaseFirestore

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        
        db = FirebaseFirestore.getInstance()
    }
}
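With Firestore initialized, creating a call document that matches the structure above might look like the following sketch. The field names mirror the schema in section 3.1; createCall and its parameters are illustrative helpers, not part of the Firebase API:

```kotlin
import com.google.firebase.firestore.FirebaseFirestore

// Illustrative helper: creates a call document matching the calls/{callId} schema
fun createCall(db: FirebaseFirestore, callerId: String, receiverId: String): String {
    val callRef = db.collection("calls").document() // auto-generated callId
    val callData = mapOf(
        "callerId" to callerId,
        "receiverId" to receiverId
        // "offer" and "answer" are filled in later, during signaling
    )
    callRef.set(callData)
    return callRef.id // share this ID with the receiver so they can join the call
}
```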

4. Implementing WebRTC for Video Calling

WebRTC (Web Real-Time Communication) is the core technology for enabling real-time video and audio communication in the app. This section involves adding WebRTC dependencies and setting up the necessary components for video calling. We create a WebRTCClient class to manage peer connections, local and remote media streams, and the exchange of SDP messages. WebRTC's integration is critical for providing the core functionality of the video call, including capturing video, streaming media, and handling network negotiations.

(Figure: how WebRTC works)

4.1. Adding WebRTC Dependencies

Add the WebRTC dependencies to your build.gradle:

dependencies {
    implementation 'org.webrtc:google-webrtc:1.0.+'
}

4.2. WebRTC Setup

Create a class WebRTCClient to manage the WebRTC setup:

Kotlin
import android.content.Context
import org.webrtc.*

class WebRTCClient(
    private val context: Context,
    private val onRemoteStreamReceived: (MediaStream) -> Unit
) {
    private val peerConnectionFactory: PeerConnectionFactory
    private var localStream: MediaStream? = null
    private var remoteStream: MediaStream? = null
    private var peerConnection: PeerConnection? = null

    init {
        PeerConnectionFactory.initialize(
            PeerConnectionFactory.InitializationOptions.builder(context).createInitializationOptions()
        )
        peerConnectionFactory = PeerConnectionFactory.builder().createPeerConnectionFactory()
    }

    fun initializeLocalStream() {
        // Bail out if no front-facing camera is available
        val videoCapturer = createCameraCapturer(Camera1Enumerator(false)) ?: return

        val videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast)
        // The capturer must be initialized with a SurfaceTextureHelper before capturing
        val surfaceTextureHelper =
            SurfaceTextureHelper.create("CaptureThread", EglBase.create().eglBaseContext)
        videoCapturer.initialize(surfaceTextureHelper, context, videoSource.capturerObserver)
        videoCapturer.startCapture(1024, 720, 30)

        val videoTrack = peerConnectionFactory.createVideoTrack("videoTrack", videoSource)

        val audioSource = peerConnectionFactory.createAudioSource(MediaConstraints())
        val audioTrack = peerConnectionFactory.createAudioTrack("audioTrack", audioSource)

        localStream = peerConnectionFactory.createLocalMediaStream("localStream").apply {
            addTrack(videoTrack)
            addTrack(audioTrack)
        }
    }

    fun startCall(remoteUserId: String) {
        val iceServers = listOf(
            PeerConnection.IceServer.builder("stun:stun.l.google.com:19302").createIceServer()
        )
        peerConnection = peerConnectionFactory.createPeerConnection(iceServers, object : PeerConnection.Observer {
            override fun onIceCandidate(candidate: IceCandidate?) {
                // Send candidate to remote user via Firestore
            }

            override fun onAddStream(stream: MediaStream?) {
                remoteStream = stream
                onRemoteStreamReceived(remoteStream!!)
            }

            override fun onDataChannel(dataChannel: DataChannel?) {}
            override fun onIceConnectionReceivingChange(p0: Boolean) {}
            override fun onIceConnectionChange(p0: PeerConnection.IceConnectionState?) {}
            override fun onIceGatheringChange(p0: PeerConnection.IceGatheringState?) {}
            override fun onSignalingChange(p0: PeerConnection.SignalingState?) {}
            override fun onIceCandidatesRemoved(p0: Array<out IceCandidate>?) {}
            override fun onRemoveStream(p0: MediaStream?) {}
            override fun onRenegotiationNeeded() {}
        })

        localStream?.let {
            peerConnection?.addStream(it)
        }

        // Create offer
        val sdpObserver = object : SdpObserver {
            override fun onCreateSuccess(sdp: SessionDescription?) {
                peerConnection?.setLocalDescription(this, sdp)
                // Send offer to remote user via Firestore
            }

            override fun onSetSuccess() {}
            override fun onCreateFailure(p0: String?) {}
            override fun onSetFailure(p0: String?) {}
        }

        peerConnection?.createOffer(sdpObserver, MediaConstraints())
    }

    private fun createCameraCapturer(enumerator: CameraEnumerator): VideoCapturer? {
        for (deviceName in enumerator.deviceNames) {
            if (enumerator.isFrontFacing(deviceName)) {
                return enumerator.createCapturer(deviceName, null)
            }
        }
        return null
    }
}


4.3. Handling Calls

In your MainActivity, manage the call flow:

Kotlin
class MainActivity : AppCompatActivity() {
    private lateinit var webRTCClient: WebRTCClient

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        webRTCClient = WebRTCClient(this) { remoteStream ->
            // Display remote stream
        }

        webRTCClient.initializeLocalStream()

        // To start a call
        val remoteUserId = "user_id_to_call"
        webRTCClient.startCall(remoteUserId)
    }
}


5. Handling Call Signaling

Among the WebRTC fundamentals, call signaling is what establishes the initial connection between two or more peers. This section describes how to exchange SDP messages and ICE candidates using Firestore. These exchanges are essential for setting up the peer-to-peer connection, negotiating the media formats, and working out a network route between the peers. With signaling handled correctly, users can connect seamlessly and the connection is negotiated for the best possible call quality.

5.1. Exchanging SDP

Use Firestore to exchange SDP messages:

Kotlin
fun startCall(remoteUserId: String) {
    val callRef = db.collection("calls").document()
    val callerId = auth.currentUser?.uid

    peerConnection?.createOffer(object : SdpObserver {
        override fun onCreateSuccess(sdp: SessionDescription?) {
            peerConnection?.setLocalDescription(this, sdp)
            val offer = mapOf(
                // canonicalForm() yields "offer"/"answer", which is what
                // SessionDescription.Type.fromCanonicalForm expects on the other side
                "type" to sdp?.type?.canonicalForm(),
                "sdp" to sdp?.description
            )
            callRef.set(mapOf("callerId" to callerId, "receiverId" to remoteUserId, "offer" to offer))
        }

        override fun onSetSuccess() {}
        override fun onCreateFailure(error: String?) {}
        override fun onSetFailure(error: String?) {}
    }, MediaConstraints())

    // Listen for answer
    callRef.addSnapshotListener { snapshot, e ->
        if (snapshot?.exists() == true) {
            val answer = snapshot.get("answer") as Map<String, String>?
            answer?.let {
                val sessionDescription = SessionDescription(
                    SessionDescription.Type.fromCanonicalForm(it["type"]!!),
                    it["sdp"]
                )
                peerConnection?.setRemoteDescription(object : SdpObserver {
                    override fun onCreateSuccess(p0: SessionDescription?) {}
                    override fun onSetSuccess() {}
                    override fun onCreateFailure(p0: String?) {}
                    override fun onSetFailure(p0: String?) {}
                }, sessionDescription)
            }
        }
    }
}
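The code above covers the caller's side only. The receiver must pick up the offer, set it as the remote description, create an answer, and write it back to the same document. A sketch of that flow, assuming the same calls schema and the db and peerConnection fields from earlier sections (answerCall and callId are illustrative names):

```kotlin
fun answerCall(callId: String) {
    val callRef = db.collection("calls").document(callId)

    // No-op observer for setLocalDescription, where no callbacks are needed
    val noOp = object : SdpObserver {
        override fun onCreateSuccess(p0: SessionDescription?) {}
        override fun onSetSuccess() {}
        override fun onCreateFailure(p0: String?) {}
        override fun onSetFailure(p0: String?) {}
    }

    callRef.get().addOnSuccessListener { snapshot ->
        val offer = snapshot.get("offer") as? Map<String, String> ?: return@addOnSuccessListener
        // lowercase() is defensive: fromCanonicalForm expects "offer", not "OFFER"
        val remoteOffer = SessionDescription(
            SessionDescription.Type.fromCanonicalForm(offer["type"]!!.lowercase()),
            offer["sdp"]
        )

        peerConnection?.setRemoteDescription(object : SdpObserver {
            override fun onSetSuccess() {
                // Remote offer applied; now create and publish our answer
                peerConnection?.createAnswer(object : SdpObserver {
                    override fun onCreateSuccess(sdp: SessionDescription?) {
                        peerConnection?.setLocalDescription(noOp, sdp)
                        callRef.update(
                            "answer",
                            mapOf("type" to "answer", "sdp" to sdp?.description)
                        )
                    }
                    override fun onSetSuccess() {}
                    override fun onCreateFailure(error: String?) {}
                    override fun onSetFailure(error: String?) {}
                }, MediaConstraints())
            }
            override fun onCreateSuccess(p0: SessionDescription?) {}
            override fun onCreateFailure(error: String?) {}
            override fun onSetFailure(error: String?) {}
        }, remoteOffer)
    }
}
```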


5.2. Exchanging ICE Candidates

Exchange ICE candidates similarly. Each peer publishes its local candidates to the call document and applies the candidates published by the other side:

Kotlin
// Inside the PeerConnection.Observer's onIceCandidate callback, publish each
// local candidate to the call document.
// Requires: import com.google.firebase.firestore.FieldValue
override fun onIceCandidate(candidate: IceCandidate?) {
    candidate?.let {
        callRef.update(
            "candidates",
            FieldValue.arrayUnion(
                mapOf(
                    "sdpMid" to it.sdpMid,
                    "sdpMLineIndex" to it.sdpMLineIndex.toString(),
                    "sdp" to it.sdp
                )
            )
        )
    }
}

// Apply the candidates published by the remote peer
callRef.addSnapshotListener { snapshot, e ->
    if (snapshot?.exists() == true) {
        val candidates = snapshot.get("candidates") as List<Map<String, String>>?
        candidates?.forEach {
            val candidate = IceCandidate(
                it["sdpMid"],
                it["sdpMLineIndex"]!!.toInt(),
                it["sdp"]
            )
            peerConnection?.addIceCandidate(candidate)
        }
    }
}

6. UI and Permissions

The User Interface (UI) and permissions setup are critical for user experience and functionality. This section covers requesting necessary permissions like camera and microphone access, which are essential for video calling. It also involves designing the UI to include components like video views for displaying local and remote streams. Ensuring that permissions are correctly requested and the UI is intuitive allows users to interact with the app smoothly, improving the overall usability and accessibility of the video calling features.

6.1. Request Permissions

Declare camera and microphone permissions in your AndroidManifest.xml:

XML
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
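On Android 6.0 and above, camera and microphone are dangerous permissions, so declaring them in the manifest is not enough: they must also be requested at runtime. A minimal sketch using ActivityCompat (the request code constant is arbitrary):

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val PERMISSION_REQUEST_CODE = 100

// Call before initializing the local media stream
fun requestCallPermissions(activity: AppCompatActivity) {
    val needed = arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO)
        .filter {
            ContextCompat.checkSelfPermission(activity, it) != PackageManager.PERMISSION_GRANTED
        }
    if (needed.isNotEmpty()) {
        ActivityCompat.requestPermissions(activity, needed.toTypedArray(), PERMISSION_REQUEST_CODE)
    }
}
```

Handle the result in onRequestPermissionsResult and only start capturing once both permissions are granted.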

6.2. User Interface

Design your UI to include video views for local and remote streams. Use a SurfaceViewRenderer for displaying the video.

XML
<org.webrtc.SurfaceViewRenderer
    android:id="@+id/local_view"
    android:layout_width="100dp"
    android:layout_height="100dp"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintStart_toStartOf="parent"
    app:layout_constraintTop_toTopOf="parent" />

<org.webrtc.SurfaceViewRenderer
    android:id="@+id/remote_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintStart_toStartOf="parent"
    app:layout_constraintTop_toTopOf="parent" />


In MainActivity, set up the SurfaceViewRenderer for the local and remote streams:

Kotlin
// Share one EglBase instance so both renderers use the same EGL context
val eglBase = EglBase.create()

val localView = findViewById<SurfaceViewRenderer>(R.id.local_view)
val remoteView = findViewById<SurfaceViewRenderer>(R.id.remote_view)

localView.init(eglBase.eglBaseContext, null)
localView.setMirror(true) // mirror the front-camera preview

remoteView.init(eglBase.eglBaseContext, null)
remoteView.setMirror(false)

webRTCClient.setLocalView(localView)
webRTCClient.setRemoteView(remoteView)
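The WebRTCClient from section 4.2 does not define setLocalView and setRemoteView; they are assumed helpers. A minimal sketch of what they could look like inside WebRTCClient, attaching each renderer as a sink on the corresponding video track:

```kotlin
// Hypothetical additions to WebRTCClient: render each stream's first video track
fun setLocalView(view: SurfaceViewRenderer) {
    localStream?.videoTracks?.firstOrNull()?.addSink(view)
}

fun setRemoteView(view: SurfaceViewRenderer) {
    remoteStream?.videoTracks?.firstOrNull()?.addSink(view)
}
```

Since the remote stream only arrives once onAddStream fires, setRemoteView should be called (or re-called) from the onRemoteStreamReceived callback.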

7. Testing and Deployment

Testing and deployment are the concluding steps, where the application is checked on different devices to confirm compatibility, performance, and stability. This includes fixing bugs, optimizing performance, and exercising every part of the call flow, ideally on two physical devices, since emulators have limited camera support. Once testing is complete, the application is ready to be published to stores such as the Google Play Store.

Conclusion

In this article, we walked through the fundamental process of building a video call app with Kotlin, using Firebase Authentication, Cloud Firestore, and WebRTC. The project can be extended with additional features such as user profiles and call logs. Together, Firebase and WebRTC make the task much easier: Firebase provides the back-end services and the signaling store, while WebRTC handles the real-time media.

