Information Technology

The Rise Of Ethics In Artificial Intelligence (Part 1: Privacy)

<p>As artificial intelligence (AI) starts to consume our lives, ethics has never been more important than it is today. We are moving towards an era where, for the first time, experts are contradicting each other about the uncertainty of our future. Ethical considerations need to be at the forefront for the brightest minds who will be developing the algorithms and machines of the future. Over the next three articles, I will take a snapshot look at the rise of ethics &ndash; starting with privacy, followed by content ownership, the implications of AI, what is being done to ensure trustworthy AI, and some interesting areas to keep up to date with. All thoughts, views, and comments are my own.</p><h2>The Truman Effect</h2><p>Today&rsquo;s generation takes its very first breath having all but accepted the terms that our faces will be immediately photographed and presented to the social realms for those all-exclusive &lsquo;likes&rsquo;. We&rsquo;re pretty much born without a choice in the matter, but does it get any better? As we roam the streets, do our shopping, drive our cars, or pop in to our neighbours&rsquo; for a cup of coffee, our faces are increasingly being captured and recognized by surveillance cameras. How many times per day, on average, would you guess you are caught on surveillance cameras? In the United States, it is estimated that the average citizen is caught up to 75 times per day, whereas in London, the average person is estimated to be caught up to 300 times per day.
One can argue that these cameras are invading our privacy, whereas others will argue that they are keeping us safe &ndash; either way, we need to look for ways to maintain our privacy whilst also keeping us safe.</p><p><em>Figure 1: The Truman Show</em></p><h2>I&rsquo;m a Good Citizen</h2><p>In one of my previous articles, &ldquo;We Are At War... A Cyber War; With 20 Billion Connected Devices. Just How Safe Do You Feel?&rdquo;, I discussed how, in China&rsquo;s move towards a cashless economy, mobile transactions had reached almost $13 trillion in 2017 &ndash; a figure that rose to $41.51 trillion in 2018 (China Daily, 2019). More and more of our lives are becoming digitalized, so much so that China has recently started scoring citizens&rsquo; &lsquo;social credit&rsquo; based on whether they have committed any minor offenses. In 2018, this &lsquo;trustworthy&rsquo; scoring (Independent, 2016) stopped 17.5 million people from buying airplane tickets, 5.5 million from hopping on a train, 290,000 from getting a high-paying senior management job, and 128 from leaving the country because they had not yet paid their taxes (Fortune, 2019). The concept is extremely interesting, but do you think it is a good idea? Will it force/encourage people to be better citizens, or is it placing too much power in the government&rsquo;s hands?
What sort of catastrophes could a data breach cause &ndash; could a person essentially be erased from ever having existed?</p><p><em>Figure 2: China&rsquo;s Social Credit System</em></p><h2>How Did You Know I Would Like That?</h2><p>Does this sound familiar&hellip; you are browsing online for a product, and the next day, on a completely different website, you suddenly notice an advert displaying a product similar to the one you were looking at? Is this just a lucky coincidence, or have your browsing profile and data been shared across different sites? The sharing of our personal data is being used to drive extremely personalized campaigns and customized services, but where do we draw the line between the best digital experience and a violation of our privacy?</p><p>Now take this digital experience into your physical stores. EyeSee mannequins, which use embedded cameras to determine your gender, race, age, and facial expressions in order to understand whether you like what you are looking at, are now being deployed in stores (Techno Fashion World, 2013). Combined with location information from your mobile device, this could potentially be used to establish whether you made a purchase. Such powerful information can be used to determine how to rearrange a shop, which clothes to put together, and what highly personalized marketing to send afterwards. So the next time you look at a mannequin and think, &ldquo;Oh, I like those clothes&rdquo;, have a little think and wonder whether the mannequin is thinking the same about you.
She may not be as lively as Kim Cattrall, but she could be watching you.</p><p><em>Figure 3: EyeSee Mannequin</em></p><h2>Terms and Conditions May Apply</h2><p>How many of us actually take the time to read the terms and conditions before accepting them? A survey by Deloitte found that 91% of people consent to legal terms and conditions without reading them, rising to 97% among those aged 18-34 (Business Insider, 2017). A good example of this comes from PC Pitstop, who hid a $1,000 prize in their terms and conditions &ndash; all the reader had to do was email them. It wasn&rsquo;t until five months and 3,000 sales later that a user actually noticed it, emailed the team, and claimed the prize (TheUIJunkie, 2012).</p><p>Companies such as Facebook and Google have been criticized for the long-winded, misleading, and unfriendly language in their terms and conditions. A recent study found that the terms and conditions of some of the most commonly used websites and applications are almost unreadable to 99% of people, with some compared to academic journals requiring over 14 years of education to understand (Vice, 2019). This suggests that tech companies are making it harder and harder to use their services without giving up your personal information and data.</p><p>In the documentary &lsquo;Terms and Conditions May Apply&rsquo;, it was said that Google is essentially a $500-a-year service, because that&rsquo;s the value placed on the data that you provide.
More and more companies are now gathering these types of data points and learning about our interests, fears, secrets, family, and friends&hellip; and we pretty much agree to the majority of it (IMDb, 2013). One of the biggest potential issues is that this data could be used as an input to an automated flagging system based on your searches and online behaviours &ndash; a concept similar to that of the film Minority Report &ndash; and there have been some examples of this in the past. In 2012, two young tourists were refused entry to the United States on security grounds after one of them tweeted that he was going to &ldquo;destroy America&rdquo; &ndash; by which he meant &lsquo;get drunk&rsquo;... millennials, huh (BBC News, 2012). In 2009, New York comedian Joe Lipari ended up with two felony charges and spent a year proving he was not a terrorist after he was unhappy with the assistance he received at an Apple store and paraphrased a quote from the film Fight Club on Facebook (Engadget, 2011).</p><p><em>Figure 4: Terms and Conditions May Apply</em></p><h2>Final Thoughts</h2><p>In 2011, Marc Andreessen stated that software is eating the world and that, in order to survive, every company must be a software company (A16z, 2011). It is almost inevitable that the convenience and enjoyment that our plethora of connected devices and applications bring us come at the cost of intrusion into our behaviours and patterns.
Although sharing these types of information does come with advantages &ndash; such as the accuracy and efficiency of Google Maps, which is said to receive 20 million user contributions every day, more than 200 every second (Geospatial World, 2019) &ndash; is the tailoring of services based on your purchasing patterns and behaviours really that bad? Or is it randomness that enriches our lives? In the casual flow of life, if we never get a chance to partake in the art of browsing and are denied the opportunity to stumble upon something new, has our freedom been taken from us&hellip; our freedom of choice? We are now at a stage where almost every company is a data company. Data has become king, and it looks like we must learn to live in the kingdom. In my next article, &lsquo;The Rise of Ethics in Artificial Intelligence (Part 2: Content Ownership)&rsquo;, I will talk about some areas in which AI is generating content, and the implications this has for ownership.</p><p>Until next time, I hope you enjoyed the read.</p>
KR Expert - Graham Baitson
