IT/CYSE 200T

Cybersecurity, Technology, and Society

This section of my e-portfolio features coursework from IT/CYSE 200T. The work included here reflects my exploration of cybersecurity through social, ethical, and organizational perspectives, with attention to how people and technology influence security outcomes.


Write Up (M6) – The Human Factor in Cybersecurity
Date: April 16, 2026
Course: IT/CYSE 200T
Format: Memo

 

Alicia Satchell 
CYSE 200T
April 16, 2026
Professor Hiser



BLUF: If I were a Chief Information Officer working with a limited budget, I would allocate the money between employee training and cybersecurity technology, with a slightly greater share going to training than to additional technology. Technical defenses are necessary, but they only go so far because cybersecurity is not just a technical issue. Most cyber threats and attacks exploit human factors: workplace behavior, insider access, poor judgment, misplaced trust, and lack of awareness. The readings from this week highlight how easily people can be manipulated into enabling effective cyberattacks, such as phishing and social engineering. Since employees can either strengthen or weaken an organization’s security, I believe the largest share of the budget should be invested in training, while technology acts as the second line of defense and prevention.

One of the main reasons I’d put more funding toward training is that many cyberattacks succeed because people are fooled, not because the systems are weak. Unlike a technical control, which behaves consistently once it is configured, humans are more complex: they take risks in their decision-making and are easily tricked. Phishing attacks, for example, work by making a person believe that a message or email is trustworthy. Social engineering works similarly by exploiting people’s emotions at their weakest moments, whether that is fear, confusion, urgency, or trust. This means that even a company with a strong technical system can still be vulnerable if employees are not trained to recognize suspicious behavior. Workers who understand how these attacks occur, and the role they play in them, are more likely to pause and think critically in the moment about choices that could harm the organization.

Likewise, another reason I’d prioritize training is that cybercrime is often connected to workplace behavior and insider access. Payne (2018) explains that because of advances in technology, crime can exist both inside and outside the workplace, creating an overlap between cybercrime and white-collar crime. He highlights that cyber offenses can involve employee access, occupational roles, and organizational trust. This matters because it shows that attacks do not just come from outside hackers; they can also come from people within an organization, depending on how they respond to systems, how well they follow security practices, and how they use their access. Training employees is therefore one of the most practical ways to reduce risk before a threat turns into a larger incident.

In conclusion, I would allocate funding between employee training and technology as a 60/40 split, with 60% going toward training and 40% toward technology. I believe this is the best choice because the human factor plays such a large role in cybersecurity. These readings show that many threats succeed by manipulating people and taking advantage of workplace access, trust, and poor decision-making. Technology is an important part of cybersecurity, but if employees are unprepared, it cannot protect an organization on its own and can lead to costlier consequences down the line. A balanced budget that gives a slight edge to training would be the most effective way to reduce cyber risk.

References

Payne, B. K. (2018). White-collar cybercrime: White-collar crime, cybercrime, or both? Criminology, Criminal Justice, Law & Society, 19(3), 16–32.

Oesterraas, I. (n.d.). Criminal justice and cybersecurity. In Cybersecurity, technology & society.

DISCUSSION BOARD (M7): Opportunities for Workplace Deviance

Date: April 21, 2026

Prompt: How has cyber technology created opportunities for workplace deviance?

Cyber technology has expanded both the reach and the harm of workplace deviance, making deviant behavior easier, faster, harder to detect, and sometimes easier to justify. Before modern technology, deviance in the workplace required direct physical access, but today employees can misuse systems, data, and communication tools from anywhere. For example, an employee can copy huge amounts of data within seconds, falsify records, bypass procedures, or harass coworkers via email. Cyber technology reduces the physical effort and time needed to engage in deviant acts.

Another reason cyber technology increases workplace deviance is that it creates a sense of distance from consequences. Individuals feel less guilt when sending rude messages, excluding coworkers from online communication, or spreading rumors via email. These practices can lead to digital harassment, bullying, and other harmful behaviors, which damage company culture and create a negative working environment. Technology also makes it easier to hide deviance in the workplace. Employees may believe that deleted messages, private accounts, or remote access make their actions difficult to detect, which tempts individuals to engage in rule-breaking behavior.

In addition, remote and hybrid work environments can weaken direct supervision and reduce the social pressure to behave appropriately. When employees work behind screens instead of in shared office environments, the accountability that comes with face-to-face interaction decreases. As a result, workplace misconduct can increase through misuse of time, reduced effort, and hostile digital behavior. Overall, cyber technology is not the direct cause of workplace deviance, but it has amplified it and created new opportunities for deviance to occur.


DISCUSSION BOARD (M3): Ethical Considerations of CRISPR Gene Editing

Date: March 13, 2026

Prompt: Based on your reading assignments related to the BioCybersecurity section of this course, identify possible ethical considerations and explain your position.

  • Malicious Code Written into DNA Infects the Computer that Reads it
  • Hacking Humans: Protecting Our DNA from Cybercriminals 

Advances in biotechnology and digital technology have created new possibilities but have also introduced serious ethical concerns. The articles “Hacking Humans: Protecting Our DNA from Cybercriminals” by Juliette Rizkallah and “Malicious Code Written into DNA Infects the Computer that Reads It” by Devin Coldewey highlight that as DNA sequencing technology and genetic information are digitized, new vulnerabilities emerge that cybercriminals could exploit. These situations raise important questions about privacy, security, and the responsibility of companies and researchers to protect sensitive biological information.

One major ethical concern discussed in Rizkallah’s article is the privacy and protection of genetic data. DNA contains highly personal information about a person’s health, ancestry, and biological relationships. If hackers stole this data, it could be misused for identity theft, discrimination, or other harmful purposes. Unlike passwords or financial information, genetic data cannot be changed once it has been exposed. This makes protecting DNA data especially important and raises questions about whether companies that collect genetic information are doing enough to safeguard it. 

Coldewey’s article introduces a different but related ethical concern at the intersection of biotechnology and cybersecurity. The research described shows that malicious code could be written into synthetic DNA and potentially infect a computer system when the DNA is sequenced and processed by software. Even though the experiment was done to demonstrate a possible vulnerability, it still raises concerns about how scientific discoveries could be misused in the future. As the technology continues to develop, researchers and organizations need to think about how these systems can be secured so that scientific advancements are not turned into new opportunities for cyberattacks.


DISCUSSION BOARD (M7): The “Short Arm” of Predictive Knowledge
Date: April 21, 2026

Prompt: From this week’s Jonas Reading: How should we approach the development of cyber-policy and infrastructure, given the “short arm” of predictive knowledge?

From the reading, the argument is that we should approach the development of cyber-policy and infrastructure with caution, humility, and a strong sense of responsibility for long-term consequences. Jonas explains that older forms of ethics fit a world where human action had limited, immediate effects; the “short arm” of predictive knowledge was sufficient because action did not require a longer reach. Now, because of modern technology, human action can affect people across borders, in offices, and in home settings, and its effects are often collective, cumulative, and irreversible. As Jonas puts it, predictive knowledge now falls behind technical power, which means ignorance can no longer serve as an excuse.

Applied to cyber-policy, this means that systems cannot be built only for efficiency or convenience. We must also ask how today’s choices will shape privacy, freedom, trust, autonomy, and vulnerability years from now. I think cyber infrastructure should be designed with a focus on precaution: minimizing unnecessary data collection, building in accountability, testing for unintended harms, and creating preventive measures before damage can occur. Jonas wants us to understand that morality is critical in public policy because the decisions made in this process can have lasting consequences for society on a much larger scale.