Anti-wining software: when ICT meets morality
Carnival is here again. The skimpily-clad will be on parade. They will be drooping under the scorching heat but will perk up when they recognise that the cameras are focussed on them. They will then be re-energised, seeking to out-gyrate each other.
The law that prohibits lewd and vulgar dancing will go into immediate hiatus, to accommodate the reign of the Merry Monarch. Republican status naturally becomes irrelevant. The unofficial "officially sanctioned" Carnival holiday becomes manifest. To use scientific terminology: a singularity, of the legal variety, occurs, as there is a discontinuity in the application of the aforementioned law.
There has been sustained criticism over the years, from some quarters, of the authorities for allowing, and apparently actively facilitating, this contravention of what is considered proper and moral behaviour in the public space. Many are of the view that during the festival the line separating lewdness and pornography becomes blurred. It would appear that human-based surveillance is unable to take effective action. Maybe technology, specifically artificial intelligence, can provide a solution. How about anti-wining software for cameras and patrol robots?
Technology is generally perceived as being "morally neutral," but it has been exploited by criminal and anti-social elements for their own immoral and illegal ends. The airing of revenge sex videos and pornography on the Internet is a clear example. Some sort of control is needed at both the filming level and the airing level. Maybe the time has come for ICT (Information and Communications Technology) to introduce some basic level of AI (Artificial Intelligence) based morality software.
For this to happen, video activity has to be interpreted. Object recognition is a well-established technology, an example of which is facial recognition software. There is a variety of other software available for analysing movements, including systems used in sports, medicine and security. These, however, require human intervention for their interpretation. What would be required is some level of human-like automatic decision-making software.
This implies AI-based software. Many of the better video cameras have inbuilt software to counter the effects of hand shake when filming. Virtually all have other focusing and optimising software to produce high-quality pictures. So intelligent inbuilt software is fairly standard in consumer electronics. What is still needed, if the filming, recording and airing of video are to be controlled, is the automatic interpretation of action scenes. There is promising news on this front.
Researchers at Google and Stanford University, working independently, have developed AI software for recognising and describing the contents of photographs and videos with a level of accuracy that sometimes mimics human levels of understanding. It is reported that the software teaches itself to identify entire scenes. The examples given were a herd of elephants marching on a grassy plain and a group of young men playing with a Frisbee. These scenes were analysed and described, quite accurately, in English.
Possible applications considered included the searching and cataloguing of the billions of images and hours of video available online, helping the blind to navigate, and facilitating the movement of robots in natural surroundings. Of course, surveillance is a big potential application.
The technical basis for the automatic censorship of lewd and pornographic behaviour thus exists. So should not software be developed and embedded into cameras that recognises actual or simulated sexual activity and either closes the shutter or does not record it? If this were effected, one could certainly anticipate a dramatic reduction in online sex videos.
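To make the idea concrete, here is a toy sketch, purely illustrative, of the decision logic such a camera might apply. It assumes, hypothetically, that an upstream recognition model has already tagged each video frame with activity labels; the label names and the function are inventions for illustration, not any real camera's API.

```python
# Hypothetical activity labels that would trigger the "shutter" rule.
# In reality these would come from a trained scene-recognition model.
BLOCKED_ACTIVITIES = {"sexual_activity", "simulated_sexual_activity"}

def should_record(frame_labels):
    """Return False (i.e. close the shutter / skip recording) if any
    blocked activity label was detected in the frame."""
    return not BLOCKED_ACTIVITIES.intersection(frame_labels)

# Example frames, tagged by the (hypothetical) recognition model:
print(should_record({"dancing", "crowd"}))                    # keeps recording
print(should_record({"simulated_sexual_activity", "crowd"}))  # shutter closes
```

The hard part, of course, is not this rule but the recognition model feeding it, which is precisely what the Google and Stanford research described above begins to make plausible.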
The time may well be now for a global movement to insist that YouTube videos be subjected to an automatic online censorship process before being aired. This would not only stop pornographic videos but, with anti-beheading software, spare the world the trauma of the horrific videos posted of late.