6 Strategies for Educating the AI Workforce

Ultimately, it’s not just about teaching students how to “do AI.” Unlike civil engineering, where learning the mechanics of bridge building is relatively straightforward, or aerospace, where the laws of gravity and propulsion are well understood, AI is far more complex.

In fact, there are significant societal, moral, ethical and even legal ramifications related to AI, and those must be part of a robust AI education. Some of those lessons can’t be taught in the classroom; they’re best learned in real-world settings.

To that end, here are six strategies that higher education should consider using to prepare students for an exciting future with AI.

1. Incorporate AI into Existing Degree Programs

I don’t see a need for dedicated AI engineering degrees, for the same reason I’ve never seen a university offer a degree in smartphone engineering or electric vehicle development. Yet we have still seen the evolution of faster chips, responsive touch screens, advanced batteries and powerful motors to enable those applications.

The curriculum fundamentals underlying AI are virtually identical to those of existing degree programs, which already focus on the foundational science: computer engineering, coding, advanced mathematics, regression analysis and so on. Engineering and physics won’t fundamentally change, but what should change are the examples and homework assignments, which must incorporate AI.

To attract the next generation of the AI workforce, colleges must consider packaging their programs with an AI focus; otherwise, prospective students may not see the connection.


2. Don’t Ignore AI’s Limitations

Because there’s so much buzz around AI’s potential, its limitations are sometimes overlooked. This can be problematic for idealistic young minds who are accustomed to trusting tech.

At its core, AI is pattern detection. It can find patterns in large data sets with remarkable speed and accuracy, but its output is only as good as the breadth of the data used for training, and it struggles to cope with the unpredictability of humans.

AI is helpful for making decisions, but it’s not always reliable, so students need to be made keenly aware of its limitations to make the most of its benefits.
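
One way an instructor might make that limitation concrete is with a small exercise like the Python sketch below (scikit-learn, with made-up numbers; none of this comes from the article). A simple model is trained on a narrow slice of data and then asked about a case outside that slice: it confidently extends the pattern it learned, and misses.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: the model only ever sees x between 0 and 5,
# where the true relationship is quadratic (y = x^2 plus a little noise).
rng = np.random.default_rng(0)
x_train = np.linspace(0, 5, 50).reshape(-1, 1)
y_train = x_train.ravel() ** 2 + rng.normal(0, 1, 50)

model = LinearRegression().fit(x_train, y_train)

# Inside the range it was trained on, the learned pattern looks reasonable...
print("Prediction at x=4 :", round(model.predict([[4.0]])[0], 1), "(true value is 16)")

# ...but beyond that range the model extends the same pattern and misses badly.
print("Prediction at x=10:", round(model.predict([[10.0]])[0], 1), "(true value is 100)")
```

The point for students is not this particular model but the behavior: the output is only as trustworthy as the breadth of data behind it.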

3. Raise Bias Awareness

AI-based analytics tools don’t examine data and issue predictions on their own. They must be trained by humans to look at the right data through the correct lens. Even the most objective, impartial and equitable individual carries some level of unconscious cognitive bias.

Bias is all around us and can influence the outcomes of AI. The key point is that bias is not necessarily negative, and it can enter data intentionally or unintentionally. For example, an analyst at an international company might examine sales data to help predict future needs and deliberately bias the data by looking only at one region or one season. University programs must include education about bias and how it can influence outcomes.
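
To make the analyst example tangible, here is a minimal Python sketch using invented sales figures (the numbers, regions and column names are hypothetical, not drawn from the article). Narrowing the data to one region and one season produces a very different baseline than the full data set, and any forecast built on that slice inherits the choice.

```python
import pandas as pd

# Invented sales records for a hypothetical international company.
sales = pd.DataFrame({
    "region": ["NA", "NA", "EU", "EU", "APAC", "APAC"],
    "season": ["winter", "summer", "winter", "summer", "winter", "summer"],
    "units":  [120, 340, 90, 310, 60, 280],
})

# Unfiltered view: average demand across every region and season.
overall_avg = sales["units"].mean()

# Intentionally biased view: only North America, only winter.
winter_na = sales[(sales["region"] == "NA") & (sales["season"] == "winter")]
biased_avg = winter_na["units"].mean()

print(f"Average demand, all data:       {overall_avg:.0f} units")
print(f"Average demand, NA winter only: {biased_avg:.0f} units")
# A forecast trained on the narrowed slice carries that choice forward --
# the same mechanism by which unintentional bias skews AI outcomes.
```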

Source: EdTech Magazine: Higher Ed
