Live Streaming Video Chat App without voice

Task Description

  • Create a Live Streaming Video Chat App without voice using the cv2 module of Python.

->In this task, we create a live streaming video chat program using the cv2, socket, and numpy modules of Python. Here, Windows runs the client program and RHEL8 runs the server program; the client connects to the server over TCP (Transmission Control Protocol).

->Since we need the cv2, socket, and numpy modules, we first install them on both Windows and RHEL8 using “pip” (Windows) and “pip3” (RHEL8), and then import them in the client and server programs (the socket module ships with Python, so only cv2 and numpy actually need installing). In my case, I am using Jupyter Notebook as the IDE on both systems. If you want to run Jupyter Notebook on RHEL8, first install it with “pip3”:

pip3 install jupyter

Then run the command below to launch Jupyter Notebook:

jupyter notebook --allow-root

->Now, since we are using TCP, we must specify the keyword “SOCK_STREAM” when creating the socket. I have connected an external webcam to the Windows machine (the client program) and use the internal webcam of the RHEL8 machine (our server program).

->On the client side, I have used the “connect()” function, which establishes a session with the server by sending a connection request.

->On the server side, I first used the “bind()” function to bind the server’s IP and port number, which acts as the program’s address for remote clients, and the “listen()” function to put the server program in listening mode, which we can verify with the “netstat -tnlp” command. Finally, to accept the client’s request, we use the “accept()” function. This function returns two pieces of information: (i) a session/connection socket, which the server program then uses to send data to and receive data from the client program, and (ii) the IP and port number of the client program. To connect to the webcams, we use the “VideoCapture()” function, passing “0” for the internal webcam and “1” for the external one.
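The socket setup described above can be sketched as follows. This runs both sides on one machine for illustration; in the real setup, HOST would be the RHEL8 server’s IP, and the port (here chosen automatically) would be a fixed number both programs agree on:

```python
import socket
import threading

HOST = "127.0.0.1"  # stand-in for the RHEL8 server's IP address

# Server side: bind the address, listen, then accept one client.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, 0))              # port 0: let the OS pick a free port
port = srv.getsockname()[1]
srv.listen(5)                    # visible in `netstat -tnlp` while listening

def serve():
    conn, addr = srv.accept()    # returns (session socket, client address)
    conn.sendall(b"hello")
    conn.close()

threading.Thread(target=serve, daemon=True).start()

# Client side: SOCK_STREAM selects TCP; connect() establishes the session.
cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect((HOST, port))
greeting = cli.recv(16)
print(greeting)  # → b'hello'
cli.close()
srv.close()
```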

->Now, the client starts sending images to the server. Technically, a video is nothing but a sequence of images displayed fast enough that it appears to us as a continuous stream. To capture an image we can use the “read()” function, but before sending it over the network we must first encode it into bytes; if we send the raw image directly, the server may not be able to decode it correctly.

->That’s why we encode the image with the “imencode()” function, which also comes from the cv2 library. imencode() converts the image into streaming data, returned as a tuple, and stores it in a buffer. A buffer (cache) temporarily holds data so that another program can fetch it. Its syntax is:

cv2.imencode(ext, img[, params]) -> retval, buf

Here, ext -> the extension (image format) in which to encode the image.

img -> the image we want to convert to bytes.

  • This function returns a tuple ‘( )’ containing a boolean return value (True or False) along with a 1D array. Since we only want to transmit the array part over the network, I use index ‘[1]’ to retrieve the array and then convert it into bytes with the “tobytes()” function.
  • Now, to send this byte stream over the network to the server program, we can use either the “send()” or the “sendall()” function (both work on connected TCP sockets; UDP uses “sendto()” instead). The key difference is that socket.send() is a low-level method that may send fewer bytes than you requested, returning the number of bytes actually sent, so part of the buffer can be left untransmitted. socket.sendall(), on the other hand, is a high-level Python-only method that sends the entire buffer you pass or throws an exception; it does this by calling socket.send() until everything has been sent or an error occurs. That’s why we use sendall() here. It returns None after sending all the data.
  • Now, the server program receives this stream of bytes using “recv()”, to which we pass an approximate maximum number of bytes as an argument. The byte stream is then held in the buffer. To retrieve the byte stream from the buffer as an array, we use a function from the numpy module called “frombuffer()”, which has the following syntax:

numpy.frombuffer(buffer, dtype=float, count=-1, offset=0)

Here, buffer -> the variable holding the stream of bytes.

dtype -> the data type as which to interpret the byte stream. By default it is float, but since we eventually want a 3D image array, we interpret the bytes as the “uint8” data type.

count -> the number of items to read from the buffer; the default, -1, means read all the data.

offset -> the position (in bytes) from which the function starts reading the buffer. Its default value is 0.
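A small frombuffer() example with made-up bytes standing in for the recv()’d stream, showing the dtype, count, and offset parameters:

```python
import numpy as np

raw = bytes([10, 20, 30, 40, 50, 60])   # stand-in for the recv()'d byte stream

# frombuffer() reinterprets the raw bytes as a flat (1-D) array.
arr = np.frombuffer(raw, dtype=np.uint8)                  # all six bytes
tail = np.frombuffer(raw, dtype=np.uint8, count=2, offset=4)  # skip 4, take 2

print(arr.tolist())   # → [10, 20, 30, 40, 50, 60]
print(tail.tolist())  # → [50, 60]
```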

But frombuffer() always returns a 1D array. To finally convert this 1D array into an image, i.e. a 3D array, we use the “imdecode()” function. It takes the 1D array as input and converts it into a color image; for this we pass a cv2 flag called “cv2.IMREAD_COLOR”, which tells it to load a color image (any transparency in the image is ignored). It is the default flag, and we can alternatively pass the integer value 1. The syntax of “imdecode()” is as follows:

cv2.imdecode(buf, flags) -> retval

Here, buf -> the 1D array given as input.

flags -> the format in which to load the image.

imdecode() returns a 3D numpy array, i.e. a color image as specified by the flags. But if the data was not received properly, imdecode() returns an empty (None) output.

Now, finally, to display the image sent by the client program, we use the “imshow()” function, passing a label (window title) and the 3D image we want to display as arguments. To hold the image window open we use the “waitKey()” function, and to terminate (kill) the window we use the “cv2.destroyAllWindows()” function.

Now the server captures and sends its own image to the client, and the client performs the same process the server did, so the exchange continues until we interrupt it.

Finally, to disconnect from the webcam, I used the “release()” function. Refer to the GitHub URL below for the code of the client and server programs, and see the video attached below.

GitHub URL:-

Thank you for reading !!

Aditya Pande
