An interdisciplinary research effort to better understand human behavior’s effects on cybersecurity efforts.
Introduction
In a world of ever-changing technology that controls our day-to-day lives, security is rarely guaranteed. From the twentieth-century crime waves recreated in the newest Netflix original series to the present day, crime persists throughout our society in new and creative ways. Cybercrime and its counterparts present an active challenge to those attempting to protect an often-unaware public, and while the fight rages and evolves every day, new problems emerge almost weekly. Cybersecurity, like the crime it works to protect us from, continues to evolve to stand a chance against these increasing odds, but in some instances its focus must turn to less obvious issues. Human behavior plays a vital role in both the survival of a cybersecurity environment and its potential downfall; if not properly monitored and managed, it can bring down even the strongest of cybersecurity systems. For this reason, the question arises of whether human behavior and human error can ever be mitigated within a cybersecurity environment, and whether the perspective of those carrying out attacks can inform that effort. Throughout this paper, that question is explored through the disciplines of psychology and criminology to gain a deeper understanding of what the field of cybersecurity can do to keep up.
Discussion
In research situations, the question of whether to approach a problem with an interdisciplinary mindset is often met with confusion. An interdisciplinary mindset, even one not initially recognized as such, is vital in most research environments because of the deeper understanding different fields can offer on the question being asked. To explore human behavior and its effects on cybersecurity, one must examine two different groups of people. The first is the user, approached through psychology: how users perceive technology, whether they are aware of or ignorant toward change and necessary risk, and how they use technology, whether willingly or negligently. These questions are important in finding vulnerabilities within a cybersecurity system, vulnerabilities that an attacker can take advantage of. The attacker is the second group, explored through the discipline of criminology: the study of crime and of why criminals commit it. Criminology fits well within this interdisciplinary framework because it shows how an attacker will perceive a vulnerability, especially one created by human error. These two disciplines can then be drawn together to conclude how cybersecurity can best mitigate human error, and in particular how to avoid human-created vulnerabilities.
Discipline #1
Rachid Ait Maalem Lahcen, in a 2020 article titled “Review and Insight on the Behavioral Aspects of Cybersecurity,” wrote, “It is easier to blame the human during a cyber incident instead of blaming the cyber program or the design of the systems. In fact, the system design that did not consider the human factor is also to blame” (Maalem Lahcen et al., 2020). The comment concerns how cybersecurity systems are organized into domains, each representing how the system interacts with one part of its environment. Of these, the weakest link lies within the user domain, the domain representing the person controlling the system through physical commands and operations. What Maalem Lahcen et al. highlight is just how vital a role human error plays in determining the outcome of a cybersecurity system, and just how dangerous the human behavior factor can be. While cybersecurity by design serves to protect against cyber-related threats, the field also seeks to rid an environment of all vulnerabilities, including those created by human behavior. Shari Lawrence Pfleeger supports this claim in her 2012 article on mitigating risk through behavioral science: “Problems of appropriate response to cyber incidents are exacerbated when security technology is perceived as an obstacle to the user. The user may be overwhelmed by difficulties in security implementation, or may mistrust, misinterpret, or override the security” (Pfleeger & Caputo, 2012). In summary, while the field of cybersecurity works to build systems that combat cyber-related threats, it must also cover vulnerabilities that exist in the physical setting, including those created by human error or in response to human behavior.
Discipline #2
Psychology plays into this by uncovering the deeper meaning behind the human behavior issues that the field of cybersecurity must manage. While these issues can stem from a multitude of backgrounds or pre-existing conditions, Pfleeger and Caputo's comments on trust deserve further consideration. A user's perception of the system they are using is among the most important factors to consider: how they view the system, whether they trust it completely, and whether they take proper precautions. If a user is ignorant of what needs to be done to protect or maintain their system, or is simply negligent of system maintenance in general, errors can be created and vulnerabilities can develop. This lack of trust is further explored in Kami Vaniea and Yasmeen Rashidi's 2016 study of how users perceive software updates, where, in response to a large portion of users ignoring simple maintenance, they note that “People are also hesitant to apply updates because they are annoyed or confused about the update message that they received” (Vaniea & Rashidi, 2016). This lack of awareness of how important something like an update is to a system can best be summarized by comparing the computer in question to a new car. If the car is driven routinely for a long period of time yet the driver ignores maintenance like oil changes, the car's condition will deteriorate and its performance will decline. The same can be said for even the most basic computer systems: if routine maintenance is ignored, whether through ignorance or negligence, problems can be created that have the potential to develop into further, more severe problems.
Discipline #3
The second group of people that must be explored, and the third discipline needed to form common ground, is criminology: how the attacker perceives the target. Specifically, this section examines how the attacker perceives a vulnerability as well as the person who created it. Returning to the observation that the user is the weakest link in the system, attackers will often challenge that domain first to discover or take advantage of a vulnerability. One such technique is social engineering, in which the attacker targets weaker links within the social hierarchy to exploit what Jason Nurse described as “…a willingness to trust others and to be kind, the impact of anxiety and stress on decision making, personal needs and wants, and in some regards, the naivety in decision making” (Nurse, 2018). While this type of attack is not unique to cybersecurity, it is especially prevalent within the field because of the widespread lack of awareness in the user domain noted in previous sections. Users will often be unaware of the attack and, without proper training or knowledge, will unknowingly give up the keys these attackers need to carry it out. On the importance of understanding risk and of technological literacy, Pfleeger and Caputo comment on risk perception: “…they underestimate the risk, users may think they are immune to cyberattacks, even when others have been shown to be susceptible” (Pfleeger & Caputo, 2012).
Applications and Common Ground
The first common ground to be discovered is between criminology and psychology, in how the user perceives risk. Understanding risk is vital in cybersecurity because of the abstract nature of the term itself: risk refers to what can be lost or what cannot be undone, and without a proper understanding of it, that kind of negligence can be exploited. Attackers will often target those who fail to recognize risk because they are ignorant of what they stand to lose; as Pfleeger and Caputo put it, “they think they are immune.” Given this common ground in how the two groups of people perceive risk, one might assume that the way to address human error in this environment is to classify the risk or better educate the users themselves. Pfleeger and Caputo explored this, however, comparing users' attitudes to cognitive dissonance: “Cognitive dissonance is central to many forms of persuasion to change beliefs, values, attitudes, and behaviors. To get users to change their cyber behavior, we can first change their attitudes about cyber security” (Pfleeger & Caputo, 2012). This wraps back to the idea of changing the user's perspective, something Vaniea and Rashidi were exploring with users' opinions on software updates. It is difficult to do, however, because of the level of ignorance and mistrust present within the field.
Another area of common ground lies between the technical side of cybersecurity and the field of criminology, in determining what is going to be targeted. While an understanding of who is being targeted is present, what data could be of interest is something many cybersecurity experts continuously watch for. Maalem Lahcen et al. made note of this, describing such data as an opportunity in the eyes of an attacker: “Applying described theories to cyber domains should help to identify targets by understanding opportunities of a crime. This can be a subject of asset management and risk assessment. What are the crown jewels? And what are their vulnerabilities?” (Maalem Lahcen et al., 2020). Once the “crown jewels” (the data) are understood, the first vulnerability to consider is who is guarding that data, whether knowingly or not. This, in my opinion, is a critical error within cybersecurity: in some situations, data supposedly held at this high level is left with those who do not understand its weight or simply do not know how to keep it safe from attackers. The unfortunate reality is that the data most often considered vital, and most likely to be targeted during an attack, can frequently be accessed from each individual system because of its connection to the server. This links back to the understanding that the weakest link is often the person most ignorant of the possibility of an attack.
Conclusion
Given these three disciplines and the perspectives they provide on the issue in question, one could assume that the first step in resolving these vulnerabilities is better training; the issue with this, however, rests in Pfleeger and Caputo's comment on cognitive dissonance. The problem lies not in the technological training of users but in their understanding of the system and its operations. Their skewed perceptions, not simply an inability to use the technology, are what create the vulnerabilities. This issue exists not because of anything inherent to cybersecurity but because of the psychology of the user, whose perception of technology or mistrust of what they are using allows them to remain ignorant. The connection involving human behavior that must be resolved is therefore not the one between cybersecurity and psychology but the one between criminology and psychology. To further advance these systems against human error and behavior, those in charge must explore who is being targeted and why, so that they can address the weak links most likely to cause errors. In the end, users will continue to make errors, and those errors will have every opportunity to develop into vulnerabilities that attackers can exploit. Resolving human-behavior-related vulnerabilities born of negligence toward the technology in use is nonetheless vital to the survival of a company.
References
Maalem Lahcen, R. A., Caulkins, B., Mohapatra, R., & Kumar, M. (2020). Review and insight on the behavioral aspects of cybersecurity. Cybersecurity, 3(1). https://doi.org/10.1186/s42400-020-00050-w
Nurse, J. R. (2018). Cybercrime and you: How criminals attack and the human factors that they seek to exploit. The Oxford Handbook of Cyberpsychology, 662–690. https://doi.org/10.1093/oxfordhb/9780198812746.013.35
Pfleeger, S. L., & Caputo, D. D. (2012). Leveraging behavioral science to mitigate cyber security risk. Computers & Security, 31(4), 597–611. https://doi.org/10.1016/j.cose.2011.12.010
Vaniea, K., & Rashidi, Y. (2016). Tales of software updates. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/2858036.2858303