8 February 2024

Girl Geek Dinner talk about SCSC

We’re aware that we took a break over Christmas, but it’s time to get back to our lovely network of women and show you what we’re about. We have some great speakers showing an interest, and we’ll be looking for sponsors willing to host, but this month (and we know it’s short notice) Women’s Tech Hub is going to host (and we’ll pay for pizzas).

We are going to be at the 32nd Safety-Critical Systems Symposium, and we know it’s expensive for people who want to come as individuals, so we have decided to share our findings from it with you. You don’t have to be in the industry to come along, as Girl Geek Dinners is for any of our members who want to learn more.

Just so you are aware – these are some of the best people in the world, and they are set to discuss:

What are the risks of AI?

ChatGPT 4: “The risks associated with artificial intelligence (AI) are a topic of extensive discussion among technologists, ethicists, policymakers, and the general public.

Here are some of the primary concerns:

  • Job Displacement: AI and automation can lead to the displacement of jobs, as machines can perform some tasks more efficiently than humans. This can have significant socioeconomic implications, including increased unemployment and wage stagnation in affected industries.
  • Bias and Discrimination: AI systems can inherit biases present in their training data, leading to discriminatory practices. For instance, facial recognition software has been shown to have higher error rates for people of certain racial and ethnic groups.
  • Loss of Human Skills: With AI taking over tasks such as navigation or memory-dependent activities, there is a risk that humans may lose certain skills that are not used regularly, potentially reducing cognitive abilities over time.
  • Security Risks: AI can be used to develop sophisticated cyber-attacks, and AI systems themselves can be vulnerable to such attacks. The integration of AI into critical infrastructure heightens the potential impact of these risks.
  • Privacy Erosion: AI’s ability to analyze vast quantities of personal data can lead to erosion of privacy. For instance, AI can be used to make highly accurate predictions about individuals’ behaviors, preferences, and even future actions.
  • Control and Autonomy: As AI systems become more autonomous, there is a risk that they may act in unforeseen ways that are not aligned with human intentions or may be manipulated to act against human interests.
  • Ethical and Moral Considerations: There are significant ethical questions around AI, including the morality of decisions made by AI systems, particularly in life-and-death situations such as in autonomous vehicles or military applications.
  • Economic Inequality: The benefits of AI may accrue disproportionately to those who own the technology, potentially exacerbating economic inequality. Companies and nations that can invest in AI could gain significant economic advantages, leaving others behind.
  • Dependency: Over-reliance on AI can lead to a lack of preparedness when systems fail. If critical infrastructure or services are AI-dependent, outages or malfunctions could have severe consequences.
  • Manipulation and Fake Content: AI can be used to create deep-fakes and synthetic media, which can be used to manipulate public opinion, perpetrate fraud, or spread misinformation.
  • Existential Risk: In the long term, some theorists and scientists worry that the creation of super-intelligent AI could pose an existential risk to humanity if it becomes uncontrollable or its goals are misaligned with human values.
These risks require careful management and regulation, including the development of AI ethics guidelines, robust security measures, and ongoing research into AI’s broader societal impacts.”

So for all of you who can’t make it: other than the automotive side, I’ll cover all that I hear at the symposium in the Girl Geek Dinners talk so you have some insight.

