C2C Digital Magazine (Spring/Summer 2023)

A practical framework for ethics in educational technology

By Gabriel Schott, Co-Director of Bands, Pella High School, and JaeHwan “Jay” Byun, Associate Professor, Wichita State University
 

Introduction


The ethic of reciprocity states, "Do unto others as you would have them do unto you." This golden rule encapsulates thousands of years of wisdom that transcends the boundaries of culture, religion, and creed. Upon closer examination, this ethical blueprint can be divided into four specific pillars for living in harmony with one another and our world: 1) Do no harm to yourself. 2) Do no harm to your property. 3) Do no harm to others. 4) Do no harm to others' property. In our increasingly complex, technologically driven world, surrounded by a seemingly infinite amount of data and knowledge, these four pillars provide a practical framework worth considering within the contemporary conversation on ethics in educational technology.



 

Ethics for Educational Technology


The four pillars represent a philosophical summation of rules for living within a society, designed to protect oneself and one's tribe while simultaneously protecting others. Ethical frameworks value the interaction of people, ethical principles, and societal values (Spector, 2016). Significant research has been conducted on ethics as it applies to educational technology. Professional organizations such as the Association for Educational Communications and Technology (AECT) and the International Society for Technology in Education (ISTE) advocate for ethical responsibility and maintain standards that serve as templates for our profession. The more we know about the varied components of a subject, the better able we are to expand our body of knowledge and define our ethical responsibilities. Bodies of knowledge grown out of shared experience "provide a common vocabulary and knowledge inventory to aid communication and encourage shared values and practices" (Thornley, 2021, p. 150).

While we, living in the 21st century, are still waiting for the first completely new, innovative breakthrough in technology (Winchester, 2010), it is fair to say that the pace at which new platforms and applications develop, and at which existing technologies are refined, is faster than anything recorded in history. These refinements unlock incredible potential but make it difficult to anticipate the unintended consequences they tacitly introduce.

Disruptive technology is also leading an increasingly wide range of professions to integrate technology to expedite their services, improve customer experience, and increase the bottom line (Noor, 2005). Consider the trucking industry. Brian McLaughlin, president of PeopleNet, a fleet management software service, has a vision for how technology is going to shape the role of ground transportation in today's on-demand economy. He coined the term "Internet of Transportation Things," which refers to a connected supply chain that links together trucks, drivers, freight, and, most importantly, intelligence to improve safety, efficiency, and environmental performance (Kilcarr, 2017). Perhaps it is fair to consider whether ethics in educational technology extends to the growing training field required by industries outside the scope of what we previously considered 'educational.'

"The World Wide Web is nothing more than a mirror of the society, although clearer than all the previous ones, which is why moral 'sins', which could have been hidden before, can be seen now" (Pivec, 2011, p. 65).


The other relevant component of this conversation is the complex web of existing and emerging devices, software, and applications, and how those technologies are selected and incorporated into educational contexts. Many schools have adopted a 1:1 classroom model in which each student is issued a device such as a tablet, Chromebook, or laptop. While the benefits of access to technology for educational outcomes cannot be denied, issues related to economy, content, literacy, pedagogy, and community create a digital divide (Mahfood et al., 2005) that requires the application of ethics. A sharp increase in informal learning facilitated by social media influencers, oftentimes devoid of fact, poses a new challenge when considering ethical standards. Social media itself changes how we communicate and relate to each other. New uses for learning analytics, digital game-based learning, microcomputers, AI, 3D printing, and virtual reality open the door to ethical questions not yet explored.

“Ethical problems caused by the use of ICT increase proportionally to the growth of the social influence of ICT” (Pivec, 2011, p. 62).


Four rules for educational technology





Figure 1. Four Pillars for Ethics of Educational Technology

Despite the complexity of the field, the simplicity of the four pillars allows them to encompass, rather than exclude, all of these different frames of reference when considering ethics in educational technology. They are not limited to a particular industry or age group, and the tenets of each could be presented in a scaffolded hierarchy across multiple levels of a particular context. Additionally, they are universal standards, open-ended enough to bridge the differences in ethical values and perspectives across regions and nations (Parsons, 2021). What follows is a simple framework that could serve as the basis for establishing enduring ethical principles to guide the practices and policies of educational technology. Through an examination of existing research, the most-referenced ethical issues in educational technology today have been identified, and a solution for addressing them is presented through the lens of this framework.

"How a profession cares for those whom it serves is what counts for its professional ethics" (Yeaman, 2011, p. 14).

 

Do no harm to yourself


The first pillar focuses inward, examining the behavior of the self as a digital citizen through the lens of self-protection. It is now common practice for one's personal, social, and academic identity to be stored and expressed through the internet, a trend that translates into dozens of online accounts across a variety of platforms. It must therefore be a priority that all digital citizens have access to multiple pathways for learning how to maintain their privacy online through account and password management, careful internet access, and secure digital storage of various file types. Additionally, how we represent ourselves online, and the standards of accuracy and integrity we hold ourselves to, protect us from harm. We must emphasize teaching the distortion of meaning that can occur when virtual conversation strips away the physical cues of face-to-face communication.

In a way, this tenet encourages us to keep up with best practices as an ongoing exercise in curiosity. Technology has become so complex that not knowing what we do not know is a disadvantage. Corporations, even in the field of educational technology, knowingly or unknowingly misuse or sell data. The process required to access devices, apps, and ed tech tools often puts the user in the position of accepting terms they do not fully understand. Misused learning analytics can lead to ethical dilemmas and even lawsuits (Caines, 2022). Internet users become "a product for utility" when internet companies' ethical policies allow for influence in "consumerism, political agendas, and international influence" (DePriest, 2019).

In an era of big data, our online activity leaves a trail of personal information that is often collected without the user's knowledge. While laws have started to require data collection disclosures and opt-outs, users often lack the technical knowledge to fully understand the fine print of such agreements. Digital citizens ought to advocate for user privacy: for regulations that anonymize data and preserve user anonymity, and for the development and deployment of technology that increases user awareness of privacy risks and puts users in control of what data can be shared (Vicenç, 2022). With many Virtual Private Network (VPN) providers now offering affordable and accessible options, using a VPN to encrypt internet activity across devices serves as a protective measure that masks one's IP address and keeps data private (IntelligentHQ, 2022).
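The data minimization and anonymization advocated above can be illustrated with a short sketch. The example below pseudonymizes a set of records before they might be shared with a third party: direct identifiers are replaced with salted one-way hashes, and fields the analysis does not need are dropped. The record fields and function names are invented for illustration and do not come from any particular platform.

```python
import hashlib
import secrets

def pseudonymize(records, salt=None):
    """Replace direct identifiers with salted hashes; keep only needed fields."""
    # The salt is random and kept by the data owner, so the hashes cannot be
    # recomputed by anyone who sees only the shared data.
    salt = salt or secrets.token_hex(16)
    cleaned = []
    for rec in records:
        cleaned.append({
            # A one-way salted hash lets rows be linked without exposing names.
            "student_id": hashlib.sha256(
                (salt + rec["name"]).encode()
            ).hexdigest()[:12],
            "grade_level": rec["grade_level"],  # keep only what analysis needs
        })
    return cleaned

# Hypothetical record: the email never leaves the data owner's systems.
records = [{"name": "Ana Diaz", "email": "ana@example.com", "grade_level": 10}]
print(pseudonymize(records))
```

Because the salt stays with the data owner, the same student maps to the same pseudonym across shared datasets, while outsiders cannot reverse the mapping.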

Finally, there is another way that we can unknowingly harm ourselves through our interaction with technology. According to Statista, the average American spends four hours and twenty-three minutes on their phone daily and picks up their phone an average of ninety-six times per day (Ceci, 2022). We have become increasingly dependent on our devices to provide entertainment, communication, and information in a way that can affect our general personal well-being, often without our immediate realization. “Smartphones are both extrinsically and intrinsically rewarding, but the problems referred to smartphone use are not intrinsic to the tool itself, but to the dysfunctional approach people develop towards it” (Mancinelli, et al., 2022, p. 2). To do no harm to ourselves, we must collectively become more aware of our dependence on technology and begin to formulate ethical standards that position us to be more mindful, cautious, and curious about how technology can affect us personally on every level.
 

Do no harm to your property


The second pillar emphasizes protection of individual property through the implementation of proactive measures to keep our devices secure. In this context, property includes our hardware, devices, personal data, and intellectual property and the way they interconnect among the internet of things. Safeguarding one’s property requires teaching proper interaction with and respect towards the care and management of devices. Proper storage, transportation, handling, and cleaning practices protect the longevity of our devices. Keeping software up to date, as well as maintaining proper security, network configuration, and system settings, help defend our devices and, in turn, our data from becoming compromised.

Learners must be equipped with the knowledge to recognize network attacks and malicious attempts to gain access to devices and data. Viruses, malware, phishing, and ransomware are all consequences of decisions made by users who lack the knowledge to detect and mitigate attempts at unauthorized access to our systems. Services such as KnowBe4 offer practice emails that an IT administrator can send to an organization's employees, mimicking common phishing attempts. A report can then be generated that shows how many employees caught the suspicious link and how many clicked on it. Low-stakes, realistic practice, combined with knowledge of what the different types of attacks are, what they look like, and how to avoid them, would go a long way toward equipping digital citizens to protect themselves and their property.
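The kind of metrics such a simulation report might compute can be sketched in a few lines. The data layout and field names below are invented for illustration; they are not KnowBe4's actual report format or API.

```python
def phishing_report(results):
    """Summarize a simulated phishing campaign.

    results: list of dicts like {"user": ..., "clicked": bool, "reported": bool}
    """
    total = len(results)
    clicked = sum(r["clicked"] for r in results)    # fell for the lure
    reported = sum(r["reported"] for r in results)  # flagged it to IT
    return {
        "employees_tested": total,
        "click_rate": round(clicked / total * 100, 1) if total else 0.0,
        "report_rate": round(reported / total * 100, 1) if total else 0.0,
    }

# Hypothetical campaign results for four employees.
results = [
    {"user": "a", "clicked": True,  "reported": False},
    {"user": "b", "clicked": False, "reported": True},
    {"user": "c", "clicked": False, "reported": True},
    {"user": "d", "clicked": False, "reported": False},
]
print(phishing_report(results))
# {'employees_tested': 4, 'click_rate': 25.0, 'report_rate': 50.0}
```

Tracking the report rate alongside the click rate matters: a falling click rate shows fewer people being fooled, while a rising report rate shows the habit of flagging suspicious mail taking hold.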

With the rapid development and deployment of Artificial Intelligence (AI) models that use machine learning, adversarial machine learning, and deep learning to collect, analyze, and process data to solve identified problems, new AI solutions exist that can help users recognize and defend against viruses, malware, phishing, and ransomware (Patil et al., 2021). Supporting the implementation of and universal access to such AI-enabled protection measures is essential in equipping digital citizens with the tools they need to protect themselves and their property.

"Remote functioning without any feedback does not mean that we do not cause damage, guilt, or pain to someone" (Pivec, 2011, p. 65).
 

Do no harm to others


The third pillar focuses outward on others and considers how interaction with digital technology influences the real lives of other people. Just as personal privacy measures protect us from the outside, respect for the digital privacy of others demonstrates the value of human integrity and protects us from decisions that may have unintended consequences down the line. Too often, we hastily post thoughts, feedback, and comments without considering the medium through which we are trying to communicate. We have an ethical responsibility, both to ourselves and to others, to ensure that the words we choose have meaning and clearly communicate our intent in digital exchanges, which lack the body language and context that in-person interaction provides. The perceived anonymity afforded by the virtual distance of online interaction opens the door to deliberate attempts to threaten, intimidate, coerce, or offend others, behaviors devoid of integrity, often referred to as cyberbullying or trolling. In a digitally interconnected world, it is an ethical priority that we continue to refine standards for ethical conduct and invest in training and development solutions that advocate for the value of respectful communication and conduct.

Artificial intelligence is proving to be a technological revolution with enormous social impact. Having transitioned into its permeation stage, in which access becomes cheaper, more available, and more standardized (Moor, 2005), it is now easier than ever to generate written content on any subject imaginable, along with images and graphics, audio narration, and even video. There is no doubt that this development will change how we teach, interact with technology, and learn. We must understand and accept its limitations, however. AI is only as good as the data used to train it. One of the most popular AI applications, ChatGPT, learns through user interaction, which can lead to an inaccurate body of knowledge. Students can leverage AI to complete their assignments or to generate information used in their research. Without understanding its limitations, we may inadvertently plagiarize others' work or spread false information using AI (Trumbly, 2023). We have a duty to ourselves and to others to ensure that the content we consume, create, and share is factual. We must work to recognize our own biases while operating with respect for the viewpoints of others. This technology has created a "policy vacuum" through which new laws, rules, and customs are required to minimize its negative effects (Moor, 2005).

In October 2022, the White House Office of Science and Technology Policy released the Blueprint for an AI Bill of Rights, a white paper outlining a potential framework for AI. This framework builds on five principles:
  1. Safe and effective systems.
  2. Protection from algorithmic discrimination and inequitable systems.
  3. Protection from abusive data practices and agency over how personal data is used.
  4. Knowledge of when an automated system is used and its impacts.
  5. The ability to opt out of automated systems in favor of a human alternative.

"For each principle, the Blueprint lays out its importance and what should be expected of automated systems to conform to the principle and outlines how it can be put into practice through broad descriptions of examples" (Hine & Floridi, 2023, p. 2). As we seek to do no harm to others, we must work together to recognize and educate our students on the threats and unintended consequences associated with rapidly developing technology and how to interact with it in an ethical way.

“The possibility of anonymous performance and, essentially, the reduced probability of being discovered or punished does not dismiss us from moral responsibility and bad conscience” (Pivec, 2011, p. 65).


Do no harm to others’ property


The final pillar focuses on our ethical responsibility to uphold and respect the property of others. It is common for those in the field of education and training to have digital access to the sensitive data of students and their families, colleagues, and employers. Educators need access to student Individualized Education Program (IEP) and/or 504 plan (identified accommodation for disability) documents, along with student and family personal information, to complete their duties. Educational administrators must have access to personnel files and performance improvement plans to mentor and develop their staff. Instructional designers require access to often-protected content, policies, and procedures to develop training within an organization. With the near-ubiquitous digitization of data, it is essential that we as practitioners understand and value our role in the responsible use and protection of others' information to avoid causing inadvertent harm to others.

Educational technology is full of tools, applications, and processes for conducting research. To honor the work and minds of others, we have a responsibility to understand the rules and best practices for conducting research. By crediting original authors through proper citations, we foster an environment that encourages creativity, curiosity, and discovery. We must keep our ethical and integrity standards high and have the courage to hold those accountable who fail to meet an acceptable standard.

Copyright infringement, or the use of protected (copyrighted) content without acquiring permission, has become a much more common occurrence as the amount of content available to use across an unlimited number of platforms blurs the lines of what is legal. In addition, fair use copyright exceptions create grey areas as educational practitioners try to determine what is acceptable across different media. It is clear, however, that protecting the intellectual property of others is essential to encouraging the creation of new literary, artistic, musical, and dramatic works, which benefit us all.

Perhaps a practical option for creators that respects others' intellectual property, abides by copyright law, and offers an ethical solution is accessing, referencing, and utilizing content licensed within the Creative Commons. Where copyright law defines and protects intellectual property, the Creative Commons, launched in 2002, "aims to revive, clarify, and expand fair use" (Garcelon, 2009, p. 1310). Designed with four conditions (see Figure 2) in mind, there are seven Creative Commons licenses (Figure 3) from which creators may select for their work. As we seek to do no harm to others' property within the field of educational technology, it is worth considering how the Creative Commons might be used to license and access educational resources and products in an ethical way.




Figure 2. Four Conditions of Creative Commons Licensing

Seven Types of Creative Commons Licenses (Apfelbaum & Stadler, 2021)

1. Attribution (CC BY): Users may copy and distribute original versions of a work, and may modify and distribute adapted versions, with proper attribution to the original work.

2. Attribution-NonCommercial (CC BY-NC): Users may copy and distribute original and adapted versions of a work in noncommercial settings, with proper attribution to the original work.

3. Attribution-ShareAlike (CC BY-SA): Users may copy and distribute both original and modified versions of a work if the resulting work is relicensed under a compatible ShareAlike license, with proper attribution to the original work.

4. Attribution-NonCommercial-ShareAlike (CC BY-NC-SA): Users may copy and distribute original and modified versions of a work in noncommercial settings if relicensed under a compatible ShareAlike license, with proper attribution to the original work.

5. Attribution-NoDerivatives (CC BY-ND): Users may copy and distribute only original versions of a work, with proper attribution to the original work.

6. Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND): Users may copy and distribute only original versions of a work in noncommercial settings, with proper attribution to the original work.

7. Zero License (CC0): Users may copy, distribute, and adapt a work for commercial and/or noncommercial purposes without attribution to the original work.



Figure 3. Seven Types of Creative Commons Licenses
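For readers who work with licensed resources programmatically, the license terms summarized in Figure 3 can be sketched as a simple lookup. This is an illustration of those terms, not legal advice, and the table structure is our own.

```python
# Each license mapped to the four conditions it grants or withholds.
CC_LICENSES = {
    "CC BY":       {"attribution": True,  "commercial": True,  "derivatives": True,  "share_alike": False},
    "CC BY-NC":    {"attribution": True,  "commercial": False, "derivatives": True,  "share_alike": False},
    "CC BY-SA":    {"attribution": True,  "commercial": True,  "derivatives": True,  "share_alike": True},
    "CC BY-NC-SA": {"attribution": True,  "commercial": False, "derivatives": True,  "share_alike": True},
    "CC BY-ND":    {"attribution": True,  "commercial": True,  "derivatives": False, "share_alike": False},
    "CC BY-NC-ND": {"attribution": True,  "commercial": False, "derivatives": False, "share_alike": False},
    "CC0":         {"attribution": False, "commercial": True,  "derivatives": True,  "share_alike": False},
}

def may_adapt_commercially(license_code):
    """True if users may both modify the work and use it commercially."""
    terms = CC_LICENSES[license_code]
    return terms["commercial"] and terms["derivatives"]

print(may_adapt_commercially("CC BY"))     # True
print(may_adapt_commercially("CC BY-NC"))  # False
```

A lookup like this could help an instructional designer filter a pool of open educational resources down to only those whose licenses permit the intended reuse.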
 

Conclusion

While this simple framework is not a panacea for all the ethical questions that educational technology presents, it serves as a common language across the vast range of professions required to make ethical decisions in educational technology daily. It creates an opportunity to consider how our individual actions benefit or harm ourselves, our neighbors, and society. Further research should be conducted to examine educational technology ethics through additional lenses that are inclusive and representative of the diverse population of users. This diversity expands our collective body of knowledge, which better informs our practice. Avoiding harm to ourselves, to our property, to others, and to others' property ensures that we maintain our humanity and interdependence in an increasingly complex digital world.

 

References

Apfelbaum, D. S., & Stadler, D. (2021). A crash course in Creative Commons licensing. Serials Review, 47(3/4), 122-125. https://doi-org.proxy.wichita.edu/10.1080/00987913.2021.1963634

Caines, A. (2022). Data ethics for the digital education age. US Official News. link.gale.com/apps/doc/A699728375/STND?u=ksstate_wichita&sid=summon&xid=f3c4685d

Ceci, L. (2022). Average time spent daily on a smartphone in the United States 2021. Statista. https://www.statista.com/statistics/1224510/time-spent-per-day-on-smartphone-us/

Creative Commons. (2020). Creative Commons for educators and librarians. eBook edition. ALA Editions. https://drive.google.com/file/d/1w2Kz8c7xpf-fRIqRvkUjqt9drSRl7MRG/view

DePriest, D. L. (2019). In times of great change: Reality, morality, and ethics on the internet. C2C Digital Magazine, 1(10). https://scholarspace.jccc.edu/c2c_online/vol1/iss10/11

Garcelon, M. (2009). An information commons? Creative Commons and public access to cultural creations. New Media & Society, 11(8), 1307-1326. https://doi-org.proxy.wichita.edu/10.1177/1461444809343081

Growing VPN-market highlights importance of network infrastructure providers in safeguarding internet freedom. (2019). Database and Network Journal, 49(6), 15. https://link-gale-com.proxy.wichita.edu/apps/doc/A612030184/AONE?u=ksstate_wichita&sid=summon&xid=c9e2546d

Hine, E., & Floridi, L. (2023). The Blueprint for an AI Bill of Rights: In search of enaction, at risk of inaction. Minds & Machines. https://doi-org.proxy.wichita.edu/10.1007/s11023-023-09625-1

IntelligentHQ. (2022). How do VPNs help with internet privacy and safety? London: Newstex. https://proxy.wichita.edu/login?url=https://www.proquest.com/blogs-podcasts-websites/how-do-vpns-help-with-internet-privacy-safety/docview/2703954970/se-2

Kilcarr, S. (2017). The pace of technological change is accelerating in trucking. Fleet Owner. Business Insights: Essentials. http://bi.gale.com/proxy.wichita.edu/essentials/article/GALE%7CA501107475?u=ksstate_wichita

Mahfood, S., Astuto, A., Olliges, R., & Suits, B. (2005). Cyberethics: Social ethics teaching in educational technology programs. Communication Research Trends, 24(4), 3-22.

Mancinelli, E., Ruocco, E., Napolitano, S., & Salcuni, S. (2022). A network analysis on self-harming and problematic smartphone use – the role of self-control, internalizing and externalizing problems in a sample of self-harming adolescents. Comprehensive Psychiatry, 112. https://doi.org/10.1016/j.comppsych.2021.152285

Moor, J. H. (2005). Why we need better ethics for emerging technologies. Ethics and Information Technology, 7, 111-119. https://doi.org/10.1007/s10676-006-0008-0

Noor, A. K. (2005). Disruptions of progress: There's no slowing the pace of technological change: Engineering practice will have to adapt to keep up. Mechanical Engineering CIME. http://bi.gale.com.proxy.wichita.edu/essentials/article/GALE%7CA138996859?u=ksstate_wichita

Parsons, T. (2021). Ethics and educational technologies. Educational Technology Research and Development, 69, 335-338. https://doi.org/10.1007/s11423-020-09846-6

Patil, S., Varadarajan, V., Walimbe, D., Gulechha, S., Shenoy, S., Raina, A., & Kotecha, K. (2021). Improving the robustness of AI-based malware detection using adversarial machine learning. Algorithms, 14(10), 297. https://doi.org/10.3390/a14100297

Pivec, F. (2011). Codes of ethics and codes of conduct for using ICT in education. Organizacija, 44(3). https://doi.org/10.2478/v10051-011-0007-8

Spector, M. J. (2016). Ethics in educational technology: Towards a framework for ethical decision making in and for the discipline. Educational Technology Research and Development. https://doi.org/10.1007/s11423-016-9483-0

Thornley, C., et al. (2021). Good to know: An exploration of the role and influence of professional ethics in ICT bodies of knowledge (BoKs). The Electronic Journal of Knowledge Management, 19(2), 150-164. Available online at www.ejkm.com

Trumbly, N. (2023). Chat GPT is a threat to informed society. University Wire. https://proxy.wichita.edu/login?url=https://www.proquest.com/wire-feeds/chat-gpt-is-threat-informed-society-xa0/docview/2789813443/se-2

Vicenç, T. (2022). Guide to data privacy: Models, technologies, solutions. Springer. https://doi-org.proxy.wichita.edu/10.1007/978-3-031-12837-0

Winchester, I. (2010). Education and the pace of technological change. The Journal of Educational Thought, 44(3).

Yeaman, A. (2004). The origins of educational technology's professional ethics: Part one. TechTrends, 48(6), 13-14.

Yeaman, A. (2005). The origins of educational technology's professional ethics: Part two – establishing professional ethics in education.

Yeaman, A. (2011). A code for the profession. TechTrends, 55(3).


 

About the Authors




Gabriel Schott is a highly experienced and dedicated music educator, passionate about developing inclusive organizational cultures that cater to the needs of all learners. With over 10 years of experience in Learning and Development, Gabriel specializes in designing innovative and effective learning solutions that empower individuals and teams to achieve their goals.

Gabriel holds a Bachelor of Music Education degree from the University of Northern Iowa and is currently pursuing a master's degree in Learning and Instructional Design at Wichita State University.

As Co-Director of Bands at Pella High School in Pella, IA, Gabriel creates high-quality musical experiences for his 210-member high school band. He employs backwards design, differentiated instruction, and game-based learning to maximize learning outcomes at the individual, group, and full ensemble levels.

As the founder of Schott Learning Solutions, Gabriel is deeply committed to investing in the potential of individuals and organizations. He creates engaging learning experiences that promote inclusivity and ensure success for learners of all backgrounds and learning styles. Through the incorporation of gamification, storytelling, and real-life scenarios, Gabriel ensures that all learners can actively engage and thrive in their educational journey.

His email is gdschott@shockers.wichita.edu.







JaeHwan "Jay" Byun is an associate professor and the chair of the Master of Education in Learning and Instructional Design program in the School of Education of the College of Applied Studies at Wichita State University. He has been teaching courses related to instructional design and technology at WSU since 2015. Dr. Byun is interested in research topics, including learner engagement, online learning, learning analytics, and digital game-based learning. His career goal is to seek ways to create a learning environment where learners can learn through aesthetic learning experiences which are engaging, infused with meaning, and feel coherent and complete. He is currently involved in research about digital game-based learning for refugee students for their language learning through science concepts.

Dr. Byun may be reached at jaehwan.byun@wichita.edu.
