DTU Prof. Dinesh Vishwakarma and his team have built an AI model to detect fake news, deep fakes, and hate speech.
Transcript
00:00 Today we are here with Professor Dinesh Kumar Vishwakarma of the Department of Information Technology, DTU.
00:07 We will learn from the Professor what work is going on here and how their AI model can detect such content.
00:13 Thank you very much.
01:36 ...news, we can check whether it is fake or real.
01:42 Our algorithms work for that.
01:46 The main purpose is to check whether a news item is authentic, and whether a portal
01:57 carries something that may possibly be wrong, because we cannot call it wrong
02:03 until we prove it.
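To illustrate the kind of check described here, below is a minimal sketch of a news-text classifier, assuming a TF-IDF plus logistic-regression setup; the toy headlines, labels, and model choice are invented for demonstration and are not the team's actual system.

```python
# Minimal sketch: TF-IDF features + logistic regression as a fake-news text classifier.
# The toy headlines and labels below are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

headlines = [
    "Government announces new metro line for Delhi",        # assumed real
    "Scientists confirm drinking water cures all disease",  # assumed fake
    "University releases annual admission schedule",        # assumed real
    "Celebrity secretly replaced by robot, insiders say",   # assumed fake
]
labels = ["real", "fake", "real", "fake"]

# Pipeline: convert text to TF-IDF features, then fit a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(headlines, labels)

# Check a new headline; a production system would train on a large labeled corpus.
print(model.predict(["Miracle herb reverses aging overnight"]))
```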
02:06 We did research on this.
02:09 We know that whenever a person says something false,
02:13 the way they speak changes,
02:18 and sometimes in ways the speaker is not even aware of.
02:21 Cognitive science says that when someone says something false,
02:27 their level of confidence
02:30 is not the same.
02:33 Similarly, when someone delivers hate speech,
02:39 it is seen that the lip movement differs
02:45 and the lip syncing changes.
02:48 So we look at these cues
02:50 to try to determine
02:53 that this may be hate content.
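One way lip-movement cues like these can be quantified is sketched below: track the mouth opening across video frames with dlib's 68-point face landmarks and correlate it with audio loudness. The file paths, the landmark model file, and correlation as a sync score are all assumptions for illustration, not the group's published method.

```python
# Sketch: compare mouth-opening motion with audio energy as a crude lip-sync cue.
# Paths and the 68-landmark model file are assumptions; in the iBUG 68-point scheme,
# point 62 is the inner upper lip center and 66 the inner lower lip center.
import cv2
import dlib
import librosa
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")  # assumed local file

def mouth_opening_per_frame(video_path):
    """Return the vertical inner-lip distance for each frame with a detected face."""
    cap = cv2.VideoCapture(video_path)
    openings = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray)
        if faces:
            pts = predictor(gray, faces[0])
            openings.append(abs(pts.part(66).y - pts.part(62).y))
    cap.release()
    return np.array(openings, dtype=float)

def sync_score(video_path, audio_path):
    """Correlate mouth motion with audio loudness; low values hint at mismatched lip sync."""
    mouth = mouth_opening_per_frame(video_path)
    audio, sr = librosa.load(audio_path, sr=None)
    rms = librosa.feature.rms(y=audio)[0]
    # Resample the audio-energy curve to one value per video frame before correlating.
    rms_per_frame = np.interp(np.linspace(0, 1, len(mouth)),
                              np.linspace(0, 1, len(rms)), rms)
    return np.corrcoef(mouth, rms_per_frame)[0, 1]
```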
02:57 Now, deep fakes:
02:59 this is a very threatening technology
03:02 which has become very popular in recent times.
03:07 A lot of research has been done on it,
03:11 and this research has become popular
03:13 mostly in the last year or two,
03:16 but we have been doing research in this field
03:19 for more than five years.
03:24 So when we looked at this,
03:26 the most important observation was
03:28 that this technology is heavily misused.
03:30 Its positive side
03:33 has not been explored much,
03:35 while the entire negative side is being explored.
03:38 The first major misuse to emerge
03:42 was in pornography,
03:44 and popular celebrities
03:47 can be threatened by this type of technology,
03:51 as can popular leaders.
03:53 And now we are seeing these things
03:56 happen in the current days:
03:57 there are many celebrities
03:59 who have been targeted,
04:01 and some politicians have been targeted.
04:03 So the impact on society has been very negative.
04:08 And the bad part of AI
04:13 is that it has advanced to a point
04:17 where telling fake from real
04:20 is very difficult
04:22 for a normal user.
04:24 So for this,
04:26 we have built an algorithm.
04:30 In simple language, an algorithm
04:32 is basically a computer program,
04:36 and with it we can check any content.
04:37 That is the system we have built.
04:47 We collect data and feed it into AI models.
04:51 We created a face-detection step,
04:55 and the algorithm identifies the content automatically.
04:59 How this happens depends on the approach being used:
05:03 the first approach was feature based,
05:07 which was machine-learning based;
05:11 now the concept of deep learning is growing more,
05:15 and in that approach a separate face-detection step
05:19 plays less of a role, as it is handled easily.
05:21 At first, we used the feature-based approach.
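As a hedged example of the feature-based stage mentioned here, the sketch below runs a classic Haar-cascade face detector with OpenCV; the image path and the downstream classifier call are placeholders, not the actual pipeline.

```python
# Sketch: a classic feature-based face detector (OpenCV Haar cascade), the kind of
# explicit face-detection step that end-to-end deep models can make unnecessary.
import cv2

# The cascade file ships with OpenCV; the image path is a placeholder.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("frame.jpg")  # assumed input frame
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Detect faces; each hit is an (x, y, w, h) box that a downstream
# fake/real classifier would then crop and analyse.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    face_crop = image[y:y + h, x:x + w]
    # classifier(face_crop) would run here in a full pipeline (hypothetical).
print(f"detected {len(faces)} face(s)")
```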
05:23 The idea is that if we look at
05:27 different geographical regions,
05:29 people have different skin tones.
05:35 So the data should cover
05:39 essentially the entire world;
05:41 we cannot rely on data
05:45 drawn from only one
05:47 geography.
05:49 For that reason,
05:53 we try to collect data
05:55 worldwide,
05:57 and only then
05:59 do we use it.
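A simple way to sanity-check that kind of worldwide coverage is sketched below with pandas; the metadata file and its region and skin_tone columns are hypothetical stand-ins for real dataset labels.

```python
# Sketch: audit how evenly a face dataset covers regions and skin tones.
# "metadata.csv" and its columns are hypothetical stand-ins for real dataset labels.
import pandas as pd

meta = pd.read_csv("metadata.csv")  # assumed columns: image_id, region, skin_tone

# Share of samples per region; a balanced worldwide dataset avoids one region
# dominating, which would bias the detector toward that population.
print(meta["region"].value_counts(normalize=True))

# Cross-tabulate region vs skin tone to spot empty cells (uncovered groups).
print(pd.crosstab(meta["region"], meta["skin_tone"]))
```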
06:01 And when it comes to research,
06:05 people who are doing it
06:07 are a little reluctant
06:09 to create and share data.
06:11 But since
06:13 we are doing research in India,
06:15 and the youth of India
06:21 are building a lot of things,
06:25 we say that
06:27 before anything else we should build
06:29 our own resources,
06:31 such as datasets,
06:33 and generate
06:35 and create them,
06:37 so that in our Indian scenario
06:39 we can see how the algorithms are performing
06:41 and how the programs are working.
06:47 A lot of work is now happening on those things,
06:49 and we hope
06:51 that in time
06:53 it will produce good results.
06:55 The AI algorithm
06:57 that you have developed:
06:59 in what form is it now,
07:01 and how will the public use it,
07:03 in the form of an app
07:05 or in the form of a website?
07:07 Which is it?
07:09 You have asked a very good question.
07:13 When we do research,
07:15 the usual objective
07:17 in India
07:19 is that people do the research,
07:21 publish it,
07:23 and after publishing it
07:27 they say
07:29 that the job is done.
07:30 That is still how
07:32 most research is done
07:33 in India,
07:34 but the scenario
07:35 is now changing.
07:38 So when we research,
07:39 we look at
07:40 what the work is,
07:46 and we consider
07:48 in which form
07:50 it will be delivered.
07:53 For example,
07:55 there is fake news detection,
07:58 hate speech detection,
08:00 or deep fake detection.
08:01 So for this,
08:03 we see that
08:05 we have to make
08:06 an application,
08:08 like a web application,
08:10 or we have to make
08:11 an app
08:13 that does this work.
08:15 With that, people can check content, and in this direction
08:18 we can do more work,
08:20 because we focus more on product-based research,
08:24 so that products are built in our own country,
08:27 are more useful to the country,
08:30 and we can use them here.
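As a rough sketch of how such a web application might expose the detector, here is a minimal FastAPI service; the /detect route, the run_detector stub, and the response format are illustrative assumptions, not the team's product.

```python
# Sketch: minimal web API that could expose a fake-content detector.
# The route, model stub, and response shape are illustrative assumptions.
from fastapi import FastAPI, File, UploadFile

app = FastAPI()

def run_detector(data: bytes) -> float:
    """Hypothetical stand-in for the trained model; returns a fake-probability."""
    return 0.5  # placeholder score

@app.post("/detect")
async def detect(file: UploadFile = File(...)):
    data = await file.read()
    score = run_detector(data)
    return {"filename": file.filename,
            "fake_probability": score,
            "verdict": "fake" if score > 0.5 else "real"}

# Run with: uvicorn detector_api:app --reload  (assuming this file is detector_api.py)
```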