{"id":29,"date":"2021-08-13T21:43:03","date_gmt":"2021-08-13T21:43:03","guid":{"rendered":"https:\/\/sites.wp.odu.edu\/oducleaders\/?page_id=29"},"modified":"2025-05-03T00:25:37","modified_gmt":"2025-05-03T00:25:37","slug":"diversity","status":"publish","type":"page","link":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/diversity\/","title":{"rendered":"Diversity"},"content":{"rendered":"<p>Description of the Course<\/p>\n<p>PHIL 355E gave me a structured space to explore the ethical and cultural implications of cybersecurity, privacy, and emerging technologies like AI. It wasn\u2019t just about technical systems\u2014it was about people. The course helped me look at how different communities experience surveillance, bias, and digital risk differently based on race, location, access, and history.<\/p>\n<p>As someone raised across two very different cultures\u2014Nigeria and the United States\u2014this course gave language to things I had felt intuitively: the gaps in global access, the power dynamics in tech development, and the ethical weight of the systems we build.<\/p>\n<p>\u2e3b<\/p>\n<p>Work Samples<br \/>\n\u2022 Final Paper \u2013 \u201cFree Speech vs. Deepfakes: Who Gets Protected in the Age of AI?\u201d<br \/>\nThis paper explored the tension between First Amendment protections and the real-world harm caused by generative AI. It looked at who is most likely to be targeted or misrepresented and how policy could better protect marginalized voices.<br \/>\n[Upload as PDF or embed viewer]<br \/>\n\u2022 Discussion Posts &amp; Weekly Reflections<br \/>\nI consistently reflected on how ethical theories connect to real-world tech use, and how cultural values shape what we see as \u201cnormal\u201d online.<br \/>\n[Optional: screenshot excerpts or combine into one media PDF]<\/p>\n<p>\u2e3b<\/p>\n<p>Reflection<\/p>\n<p>This course helped me step back from the code and think more deeply about why we build what we build\u2014and who it impacts. 
One of the most important takeaways was realizing that technology is never neutral. Even in cybersecurity, the policies we write, the models we train, and the systems we secure all reflect cultural priorities and biases.<\/p>\n<p>I saw how my own identity shapes the way I view systems\u2014coming from a culture where privacy is often communal, then shifting into a society where data ownership is highly individual. This tension helped me understand how to adapt my communication, challenge assumptions, and approach tech design with more cultural awareness.<\/p>\n<p>\u2e3b<\/p>\n<p>Skills Developed<\/p>\n<p>Through this course, I developed several employer-valued skills:<br \/>\n\u2022 Obtain and process information: Evaluated ethical frameworks across cultural contexts to support policy recommendations in AI and cybersecurity.<br \/>\n\u2022 Communicate verbally and in writing: Participated in class discussions and wrote reflection papers that translated abstract theory into applied tech ethics.<br \/>\n\u2022 Make decisions and solve problems: Tackled real-world dilemmas like misinformation, facial recognition bias, and AI impersonation with structured ethical reasoning.<\/p>\n<p>\u2e3b<\/p>\n<p>Relevance to My Goals<\/p>\n<p>My long-term goal is to lead in cybersecurity GRC or AI governance. This course reinforced that cultural context and ethical reflection are just as important as technical knowledge when shaping policy or protecting users. 
Whether I\u2019m working in cloud compliance or consulting on AI risk, the ability to hold space for different perspectives\u2014and design around them\u2014is a core leadership skill.<\/p>\n<p>This experience also pushed me to reflect more honestly on how my own background impacts how I lead, how I build, and how I connect with people in the tech world and beyond.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Description of the Course PHIL 355E gave me a structured space to explore the ethical and cultural implications of cybersecurity, privacy, and emerging technologies like AI. It wasn\u2019t just about technical systems\u2014it was about people. The course helped me look at how different communities experience surveillance, bias, and digital risk differently based on race, location, &hellip; <a href=\"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/diversity\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">Diversity<\/span><\/a><\/p>\n","protected":false},"author":21125,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"_links":{"self":[{"href":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/wp-json\/wp\/v2\/pages\/29"}],"collection":[{"href":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/wp-json\/wp\/v2\/users\/21125"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/wp-json\/wp\/v2\/comments?post=29"}],"version-history":[{"count":3,"href":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/wp-json\/wp\/v2\/pages\/29\/revisions"}],"predecessor-version":[{"id":201,"href":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/wp-json\/wp\/v2\/pages\/29\/revisions\/201"}],"wp:attachment":[{"href":"https:\/\/sites.wp.odu.edu\/mrabi-eportfolio\/wp-json\/wp\/v2\/media?parent=29"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}