
AI is Disrupting Education … But It’s Not Out for Our Jobs

November 14, 2018

Michelle Dickinson, a nanotechnologist and faculty member at the University of Auckland in New Zealand, loves to teach welding. But it’s expensive: you need a special lab that holds only about 20 engineering students at a time, and that’s not an efficient way to teach.

Through the magic of augmented reality (AR), her students now learn welding at home. All they need is some simulation software. They wear a visor with sensors and displays and use robotic equipment that simulates the experience with 3D imagery that reacts to their actions. Systems like these allow instructors to follow student progress in real time. [Read: Welding Simulator Uses Augmented Reality to Teach Students Safely]

Dickinson said this technology can actually teach welding better than she can because it’s more efficient: students could spend all weekend learning to weld if they really wanted to, rather than just the 50 minutes they get in her class. Doesn’t sound like the best job security, right? Wrong. The founder and CTO of Nanogirl isn’t threatened by this new way of teaching. She knows it’s never going to fully replace her job; it will only supplement and enhance her classroom, if she lets it. That message resonated throughout EDUCAUSE 2018 earlier this month in Denver, Colo.

Higher education professionals discussed today’s toughest technology issues facing campuses globally, and this idea emerged throughout the conference: Technologies like augmented and virtual reality and artificial intelligence (AI) are here to stay, but you don’t have to be scared of them. They’re not going to replace jobs. Rather, jobs and the way we educate tomorrow’s workforce simply need to evolve. We can work alongside these sophisticated technologies to help us do our jobs better.

“AI is going to supplement work we already do. Think of tools for AI as having your own jazz band. A jazz band riffs on a melody. One instrument starts and then another jumps in,” said Jennifer Sparrow, Senior Director of Teaching & Learning with Technology at Penn State. “Embrace your skepticism around these things.”

Sure, machines can perform some tasks better than you – that doesn’t mean you’re no longer needed.

According to the EDUCAUSE Review article “Smart Machines and Human Expertise: Challenges for Higher Education”: “AI and robotics have catalyzed a wave of automation … that will touch virtually all jobs, from manual labor to knowledge work. However, automation may be a less apt term than augmentation.”

For example, in the legal world, software created by Ross Intelligence uses IBM Watson AI technology to read through thousands of cases almost instantly, more than a lawyer could read in a lifetime. [New York Times: AI is Doing Legal Work. But It Won’t Replace Lawyers, Yet.] But that software can’t try a case.

And the company Babylon has built a trained AI doctor, i.e., a chatbot, that provides medical advice to patients (currently in a test environment). Using facial recognition software during video calls, the chatbot asks patients automated questions, and everything is transcribed and categorized. Based on the movements of facial muscles, it determines whether the patient is confused, worried or neutral, and it suggests possible diagnoses to a human doctor, who can then review them and prescribe medicine. The chatbot can read millions of medical journal articles and make an informed recommendation. The idea is to provide accessible healthcare to everyone and free up doctors’ time from note-taking and routine diagnosing so they can look for more complicated problems. [Forbes: This AI Just Beat Human Doctors On A Clinical Exam]

AI, VR, AR, Machine Learning … Whatever You Call It … It Has Limits

Teddy Benson, Director of Data Integration at Walt Disney World, said in his EDUCAUSE keynote, “The Bias Truth of AI Models,” that we need to remember these advanced technologies aren’t foolproof. He referenced Tay, a chatbot Microsoft released on Twitter in 2016 that went from saying “Hello World” to posting inflammatory, racist tweets less than 24 hours after its launch. Tay learned the behavior from online trolls who attacked her, and she attacked back.

“She was an infant AI,” Benson said, adding that we need to make sure the AI engine environment we use matches our goals.

We need to train these smart technologies on how and what to respond to by giving them guidelines that pre-empt certain behaviors. For example, giving Tay a canned response like “I don’t like talking about that” any time someone typed a trigger word would have pre-empted much of the trouble.
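To make that guardrail idea concrete, here is a minimal sketch of a canned-response filter that checks incoming messages against a trigger list before the bot is ever asked to reply. The trigger words, reply text, and generate_reply() stand-in are hypothetical placeholders, not Microsoft’s actual implementation.

```python
# Minimal sketch: intercept trigger words with a canned response
# before falling through to whatever model generates normal replies.

TRIGGER_WORDS = {"politics", "religion"}  # hypothetical trigger list
CANNED_REPLY = "I don't like talking about that."


def generate_reply(message: str) -> str:
    """Stand-in for the model that would normally produce a reply."""
    return f"Echo: {message}"


def respond(message: str) -> str:
    # Normalize each word and check for any overlap with the trigger list.
    words = {w.strip(".,!?").lower() for w in message.split()}
    if words & TRIGGER_WORDS:
        return CANNED_REPLY  # pre-empt the risky topic
    return generate_reply(message)


if __name__ == "__main__":
    print(respond("What do you think about politics?"))  # canned reply
    print(respond("Hello world"))                         # normal reply
```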

So, what does that mean for higher education?

It all circles back to Michelle Dickinson, the welding instructor who is letting these sophisticated technologies work alongside her. She’s embracing them.

“It’s not about the technology. It’s about the strategy,” said Malcolm Brown, Director of Learning Initiatives at EDUCAUSE, during the conference. “It’s what the tech enables. It’s about working together across campus.”

Today’s digital-native students still crave a traditional developmental college experience, but instructors like Dickinson and educational leaders need to figure out what that looks like in a fast-paced world where students will be working alongside smart machines.

According to The 2018 Campus Computing Survey, presented at EDUCAUSE by Casey Green, early data shows more initial interest in emerging technologies like AI for analytics than for instruction. His survey found:

  • 42 percent of campus IT leaders believe AI will be an “important resource for analytics in the coming years.” That’s up from 30 percent in 2017.
  • Just under 30 percent said AI will play an important role in instruction in the next few years, up from 19 percent in 2017.

“The difference in the numbers between analytics/managerial deployment and instructional applications are not surprising. … AI functions will be imbedded into the managerial software routinely used by campus administrators. In contrast, the use of AI and AR/VR in instruction will depend on the decisions of individual faculty and academic departments,” Green said.

Using emerging technologies like these greatly enhances education and personalizes the flow of information. There’s massive potential in higher education to offer Generation Z a Netflix model for learning: using AI to suggest relevant videos and build personalized playlists. The more you use Netflix, the smarter it gets about your personal preferences, making informed decisions about what you should watch. The future of learning will consider student preferences such as how and when they want to learn and on what device. [Read: Survey: As New Generation Enters College, Artificial Intelligence Offers Potential for Netflix Model for Learning]
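As a rough illustration of that recommendation idea, the sketch below ranks unwatched learning videos by how many tags they share with a student’s watch history. The catalog, tags, and scoring rule are hypothetical placeholders, not any real platform’s algorithm.

```python
# Minimal sketch of a "Netflix model" recommender: score unwatched titles
# by how many tags they share with titles the student has already watched.

from collections import Counter

CATALOG = {
    "Intro to TIG Welding": {"welding", "hands-on", "video"},
    "Weld Safety Basics":   {"welding", "safety", "video"},
    "Thermodynamics 101":   {"physics", "lecture"},
}


def recommend(watched: list, top_n: int = 2) -> list:
    """Rank unwatched titles by tag overlap with the watch history."""
    history_tags = Counter(tag for title in watched for tag in CATALOG[title])
    scores = {
        title: sum(history_tags[tag] for tag in tags)
        for title, tags in CATALOG.items()
        if title not in watched
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]


if __name__ == "__main__":
    print(recommend(["Intro to TIG Welding"]))
    # ['Weld Safety Basics', 'Thermodynamics 101']
```

The more titles a student watches, the more weight their recurring tags carry, which is the same "gets smarter with use" behavior described above, just in toy form.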

Campus leaders need to support the people at their institutions who can educate faculty about the value of these technologies, Brown said. The tech needs to be relevant and valuable to faculty … not to mention easy to use. That’s a lot easier said than done.

But Brown said we’re at the stage between disruption and transformation. And there’s no turning back.

“As we transform into this digital age,” he said, “how do we make it as convenient as possible so (students) can be truly digital?”


