SeeChange Camera Model: Troubling Issues That the Transparency Lens Cannot See
Harinandrasana Angelos Randrianaina
A quick search for the occurrence of certain terms in the book The Circle left me amazed. Apart from the title and the name of the company, The Circle, which occurs 243 times, the terms “transparency” and “camera” appear 62 and 166 times respectively, far outnumbering the term “technology,” which appears only 26 times. This suggests that transparency is a dominant topic of the book. One of the series of new programs introduced to readers by the company's elite is the SeeChange Camera. It is first presented by Eamon Bailey, whose promising dream is to install marble-sized cameras that would put the public and the world on display for transparency's sake. Bailey goes on to list several benefits of this technological masterpiece, such as the promotion of better behavior, the prevention of crime, and accountability. Mae is amazed by Bailey's bright presentation and conclusion: “SeeChange. This is ultimate transparency. No filter. See everything. Always.” (Eggers 69) The program seems to promise the omnipresence and omnipotence of technology. However, the use of the SeeChange model seems to generate more troubling issues in terms of transparency than benefits. The present piece reflects on the SeeChange Camera model, the troubling transparency-related issues it raises, and how they play out in our world.
Transparency itself might not be fully guaranteed. In the abstract of her article “Information Is Power,” Sora Park wrote, “in a connected world, information gains power through permanent storage and wide distribution.” This statement implies that those who hold the information hold the power. The concern, then, is how the information is managed, that is, shared or displayed. Is there any limit on access if there is “No filter. See everything”? In economics, when something becomes common, it loses its value. Yet with their algorithms, the big tech companies use and manage information as a source of income and power, which would make them all the more selective in what they display.
Besides, transparency via the SeeChange Camera is unethical because it violates rights, contrary to what Bailey suggested. People lose their right to privacy. Some people might not be aware of how much using a camera, or social media powered by artificial intelligence, might cost them. Others might know but willingly overlook that fact. “You are never anonymous,” says Matt Mahmoudi, Artificial Intelligence & Human Rights Researcher at Amnesty International. Could we say, “Privacy is theft”? I remember what my professor said: “nothing is free” (of charge). One might use something like a social media platform for free or at reduced cost, but it may cost them their privacy.
Transparency via the SeeChange Camera does not tackle the root of moral issues such as crime, violence, or dishonesty. Yet Bailey suggests that the cameras would be a deterrent. Speaking of a hypothetical soldier who considers abusing a female captive, Bailey proudly states that now that soldier “should worry. He should worry about these cameras.” (Eggers 66) These acts, however, are only the tip of the iceberg of society's deep moral issues. A camera generally captures only the physical. It displays the appearance, not what lies deep inside, that is, the intention. Moreover, the form of crime would change. To kill or harm, people might not use arms and weapons. Has one never heard that “language kills more than the sword”? Consider the hateful and divisive speech in social media posts or newspaper comments after people watch or learn certain facts. Haugen put it this way while testifying before Congress: “The result has been more division, more harm, more lies, more threats and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people.” In addition, there are other aspects that the transparency lens cannot see. Progress in technology and science makes things more virtual, invisible, or tricky, beyond the camera's control. Lies, theft, crime, war, violence, and dishonesty can be carried out at very high levels. The more advanced the technology becomes, the more imperceptible the evidence of immorality. The SeeChange Camera could easily provide physical evidence of an attack against a country in a context of war. But what about the use and spread of a deadly virus, invisible to the naked eye, as a biological weapon that could kill millions of people?
Another troubling issue with the SeeChange model is safety. I recently saw on Twitter that Elon Musk offered Jack Sweeney $5,000 to delete the account ElonJet, which tracks his jet's every move. The teenager tracks him via adsbexchange.com. This is a matter of security, Musk emphasized. A similar risk applies to any high-profile figure or statesman. Bailey proudly champions the accountability of politicians and leaders but deplorably overlooks the security issue. Crime could be reduced but would still exist, and security would remain at risk. Moreover, divergence in ideology and politics keeps nations under high vigilance. A world at peace and safe under a single ideology is still a utopia.
The SeeChange Camera might also be used subjectively. The focus here is on those who control the technology, platforms, algorithms, and devices. Matt Mahmoudi of Amnesty International suggested that, at worst, the technology could be turned against a specific group of people or certain individuals. He continues, “Facial recognition can and is being used by states to intentionally target certain individuals or groups of people based on characteristics, including ethnicity, race and gender, without individualized reasonable suspicion of criminal wrongdoing.” China, for instance, is among the most advanced users of surveillance cameras: according to U.S. News, nine of the ten most-surveilled cities in the world are in China. The risk becomes political when a government or its politicians use this kind of technology as a weapon to oppress people with a different ideology or standpoint. The likely outcome is obvious: human rights abuses.
The SeeChange Camera program does not address differences in the age of its audience. To some degree, there must be age-appropriate restrictions on access to information, images, or video. A very simple case in point is the use of social media platforms today, which offers an instructive perspective on information accessibility in the SeeChange context. I was surprised to discover the rise in children under 13 years old using Facebook (or Meta) products. Recently, we have learned a great deal about the impact of social media on its audience. In her 60 Minutes interview, the whistleblower Frances Haugen revealed how much teens suffer from depression while using Instagram, and yet how the platform targets teens in their innocence. That is without mentioning the health consequences of being sedentary and passive while using the platform, the addiction it can entail, or the psychological and emotional troubles it can cause.
The other issue with transparency is consent. Consent is necessary to watch the public and to use their information; installing a camera to watch someone without consent is unethical. Does implementation require the consent of the individual or of the majority? It could also happen that the stored information is improperly used or commercialized. In fact, information can be a significant source of income. Recent events seem to prove that personal information is used in the social media realm without consent. Who does not know the Cambridge Analytica and Facebook scandal, in which Facebook users' information was used “to build voter profiles”? In that way, the information's owners could be influenced in their behavior and deprived of the ability to think and choose; their freedom could be compromised. That is what the novel expresses in its motto: “Privacy is theft.” On the surface, a mega-company like The Circle could assure the public of the security and privacy of their data, but deep down, the personal information could be used for other purposes.
To conclude, although technology, and the SeeChange Camera in particular, offers enormous benefits, when it comes to transparency there are certain issues that the transparency lens cannot see. Transparency cannot be fully guaranteed via the SeeChange model. The model leads to rights violations, and it cannot tackle the root of moral issues. Even at high performance, it is limited to capturing only physical images. Its use could be biased, in the sense that it could serve to politically control and oppress people with different ideologies, or to stereotype them. It is also challenging to respect the audience's age and to obtain people's consent. So far, transparency remains a utopia unless these challenging issues are seriously addressed in every possible aspect. This also calls for fair regulations and policies to harmonize society and protect people's rights.