Over the last few years, technology has advanced enough to realize ambitious ideas that earlier generations never thought possible. Neural networks in particular have powered some of the most remarkable inventions yet, like self-driving cars, smart assistants, and healthcare diagnostics, which have improved the quality of human life significantly.
However, competition and the drive for higher profit have led to the oversight of the ethical considerations associated with technology. The last few years have witnessed blatant disregard for the privacy of users and consumers from tech giants like Google and Facebook. While this is a serious concern, there is now a more dangerous trend: the development of software that infringes on basic human morality, technology that makes you wonder why it was ever built in the first place.
Recently a developer named Alberto built an AI that uses neural networks to generate nude images of women from clothed pictures of them. Within hours of release, the paid app had been downloaded over 500,000 times. The app, designed to work only on images of women, reflects the level of misogyny and objectification that both the developer and its users have stooped to. Within days of release, hackers had reverse engineered the algorithm and open-sourced the code on GitHub. Although the app and source were taken down after some much-needed backlash, we will never know how many copies are circulating online. When asked why he built it, Alberto said he was excited by the possibility of X-ray glasses.
VoCo is software that can alter voice recordings to include words or phrases the speaker never actually uttered. All this neural network, developed by Adobe Research, needs is a 20-minute recording of the speaker.
“We have already revolutionised photo editing. Now it’s time for us to do the audio stuff,” said Adobe’s Zeyu Jin after the appalling demo. A company of talented individuals who failed to evaluate the ethical issues. Court testimony, media, journalism, hell, even voice calls… nothing can ever be trusted again.
Boeing 737 MAX Crashes
The Boeing 737 MAX crashes that claimed the lives of hundreds of people were the direct consequence of conscious, poor design choices by the engineering team. Boeing was under pressure to match its competitors’ latest engines. The newly developed engine was too large to be installed in the usual position on existing planes. Instead of redesigning the aircraft, they mounted the engine higher up on the wing, altering the aerodynamics: the aircraft would now tend to pitch upward and climb. They developed a software solution, MCAS, which would push the nose down whenever this was detected. Pilots weren’t trained on MCAS because the new engine had to be advertised as an upgrade requiring no retraining.
With a slew of poor engineering choices already in the bag, they decided to feed MCAS from only one of the two available angle-of-attack sensors. Since these sensors are not very robust, redundant sensors are normally used to guard against false positives from erroneous data. But not Boeing: the safety feature that compared both sensors and flagged disagreement was offered as a paid add-on. In flight, MCAS wrongly sensed that the aircraft was pitching up and forced a perfectly stable aircraft downward, causing it to crash. With no safeguard in place and pilots who knew nothing about MCAS, the crews watched their aircraft go down.
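The redundancy failure described above comes down to a few lines of logic. Here is a minimal sketch of the general principle of sensor cross-checking — not Boeing’s actual MCAS code; the function names, thresholds, and readings are invented for illustration:

```python
# Illustrative sketch only: why cross-checking two redundant sensors
# beats trusting one. All names and thresholds here are hypothetical.

AOA_LIMIT_DEG = 15.0        # assumed pitch-up threshold for this example
DISAGREE_LIMIT_DEG = 5.0    # assumed allowable sensor disagreement

def single_sensor_command(aoa_a: float) -> str:
    """Trusts one angle-of-attack sensor: a faulty reading
    directly triggers a nose-down command."""
    return "NOSE_DOWN" if aoa_a > AOA_LIMIT_DEG else "NO_ACTION"

def cross_checked_command(aoa_a: float, aoa_b: float) -> str:
    """Compares both sensors first; if they disagree, the system
    disengages and alerts the crew instead of acting on bad data."""
    if abs(aoa_a - aoa_b) > DISAGREE_LIMIT_DEG:
        return "DISENGAGE_AND_ALERT"
    avg = (aoa_a + aoa_b) / 2
    return "NOSE_DOWN" if avg > AOA_LIMIT_DEG else "NO_ACTION"

# A stuck sensor reads 25 degrees while the aircraft is level (true AoA ~2):
print(single_sensor_command(25.0))       # acts on the one bad reading
print(cross_checked_command(25.0, 2.0))  # flags the disagreement instead
```

With a single sensor, one faulty reading is indistinguishable from a real pitch-up; with two, the disagreement itself is the signal that something is wrong with the data rather than the aircraft.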
Would you or your family use what you built?
Would you have someone use the DeepNude algorithm on photos of your daughter? Would you travel on a plane that doesn’t meet safety requirements? Would you or any of your kin use a product knowing that it infringes on the privacy and security of its users? If the answer is no (and I hope it is), why release it and prey on ignorant consumers? Make no mistake: this isn’t about software bugs or human error but about conscious choices, premeditated to infringe on or blatantly ignore basic human rights and morals for the sake of profit. We developers, researchers, and engineers, as creators, have the responsibility to design technology that doesn’t cause harm.
Need for Policy & Regulation
The internet and the software it carries are vast and effectively limitless. Anything released on the internet is never truly gone: from a technical standpoint, we cannot control the distribution of anything once it has been released. We need preventive measures. With an AI revolution predicted, there is an urgent need for new policies and regulations governing the development and use of technology. Software developers and engineers need to step out of their geeky bubble and pay heed to the ethical implications of what they build. A lot of ideas that sound very exciting need to stop at just that: ideas.
I keep coming back to the same questions. If my rifle claimed people’s lives, can it be that I…, an Orthodox believer, am to blame for their deaths, even if they are my enemies?
– Mikhail Kalashnikov, inventor of the AK-47