Students in last year's drone class flying one of their projects. Photo by Wes Evard
For seniors Bart Janczuk and Daniel Guerra and junior Tim Blazek, their drone coding class has turned into a high-tech game of “Simon Says.”
At least, that’s what they’re hoping they’ll be able to get their drone to do when they test-fly it in late April. Blazek smiles at the thought.
“It’s one of the more interesting classes offered here,” he says.
CSE 40773, Software Development for Unmanned Aircraft Systems, is taught by Professor Jane Cleland-Huang, who is offering the class for the third time this semester. She brought it with her from DePaul University in Chicago, where she originally developed it for graduate students. The first half of the semester is spent learning the programming language for working with drones; the second half is devoted to a group project involving the machines, with only one requirement: the drones have to fly.
The openness of the assignment means Cleland-Huang sees some interesting projects. This year, students are looking at everything from formation flying and AI learning to drone relays, voice recognition and drone security.
Last year, two students developed technology that allows dispatchers to launch a drone carrying a defibrillator. It was so successful that the pair now has an IDEA Center-backed startup, DeLive, to respond to interest in the project from fire and police departments both near and far.
On a rainy, windy day in April, the current semester’s class met one last time in the lab before flying their drones. The day was devoted to group work, with Cleland-Huang floating between tables to answer questions and troubleshoot software problems, hardware problems, and the problems that arise when you try to combine the two. The 18 students divided themselves into five teams, all flying either Intel Aeros or Iris drones for the project.
The lab on the second floor of Stinson-Remick Hall smells like construction: 3D printers thrum away in the back of the room for a robotics course, students from another class shape balsa-wood airplane wings, and other mechanical odds and ends lie scattered across the wood-topped tables. Cleland-Huang starts her day by wiping down tables still strewn with sawdust, along with the occasional errant screw or roll of tape. The room is a frequent home for engineering classes that call for some elbow grease, and those are rarely classes from the Computer Science & Engineering track. Cleland-Huang says her students like the fact that this class involves hardware. It involves physics. Sometimes it involves consultation with electrical engineering professors.
“To our students, this feels real,” she says, gesturing to a drone. “You’re not just writing code; you’re watching your code fly.”
But the hardest part is the marriage between the two worlds, uniting software and hardware inside the drone so it can function autonomously.
“That’s the thing that students find the most challenging,” Cleland-Huang says. “When you’re uniting code with the hardware, you can end up with some problems. You need a higher level of perseverance with this.”
Computer science and engineering students take a different approach to the hardware/software question than other engineers, the professor explains.
“Our students are looking at more holistic projects,” she says. “When we deploy our drones, we’re thinking about them in society — even if our society is just White Field [the stretch of land north of campus where the students practice flying]. They have to think about the safety and ethics of their drones and projects.”
Ethics in drones is a growing question; drones were recently used to locate a missing man in Florida, help fight the Notre-Dame de Paris Cathedral fire, and fire “seed missiles” to help repopulate forests in Myanmar. But drones also play a role in long-range missile strikes, raise privacy concerns for celebrities and private citizens alike, and can pose safety risks to both individuals and aircraft, as the December drone incident at London’s Gatwick Airport made clear.
Expanding beyond drones, the skills the students develop in this class are showing up in more and more types of technology, all with similar ethical concerns.
“Embedded software systems, hardware that has software controlling it, are becoming much more ubiquitous in society,” Cleland-Huang explains. “Drones just have that with the addition of flying.”
The professor says she’s happy to teach the class as a project-based course, and hopes her students take away not just technical skills but ethical lessons.
“Instead of just learning the theory, they’re learning skills closer to what an actual industry job would be,” she says.
The AED-equipped drone from last year's class. Photo by Wes Evard
Seniors Sam Berning, Justin Garrard and Taylor Murray have found this semester that the scope of a project matters; they’re the team working on drone security and the ability to ground a drone that enters a restricted space, such as protected airspace or a secured facility.
“It’s been a real challenge,” Berning says. “We’re looking at how to protect airspace, and we’re trying to assess the security of recreational drones.”
While taking a screwdriver to the body of one of the class drones, Garrard says he likes the idea of finally working with his hands in a computer science class.
“The original test case comes out of this class,” he says. “We fly right next to the South Bend airport. We wanted to build a device that would passively protect airspace.”
They’ve since discovered that recreational drones are fairly secure, and that interrupting the signal between drone and controller isn’t easy. Murray says they aren’t the first team to attempt the subject, but, according to Cleland-Huang, they’ve taken the research the furthest.
“I wanted to do something that I was interested in,” Berning says about taking the course. “It’s really rewarding to see your code acting in a physical system.”
“I’ve always been interested in drones and the AI aspect of things,” Murray adds, smiling. “It’s taken a lot more physics and electrical engineering than I thought it would. We think our project would take a lot of resources and funding — but we haven’t found out that it’s impossible.”
Amanda Gray is digital editor at Notre Dame Law School and is pursuing her master’s degree in library science at Indiana University. She previously wrote for the South Bend Tribune. Find her on Twitter @TheAmandaGray.