Y, Robot: Campus researchers are pushing the boundaries of robotics

Automated undersea mapping, swarms of computer-driven drones, self-driving cars—20 years ago, such devices would hardly have been found outside the pages of an Isaac Asimov novel.

Today, they are the focus of many research laboratories in BYU’s Department of Electrical and Computer Engineering. Each lab is headed by a dedicated faculty member, who directs research and provides mentorship to undergraduate, graduate, and doctoral students. Campus-based projects use sophisticated hardware and software to solve challenging robotics problems.

The Daily Universe spent some time with two of these groups: the Intelligent Multi-Agent Coordination and Control Lab and the Robotic Vision Lab.

The two groups call the department’s information systems and science wing home. Although both laboratories address challenges related to machine automation and robotic control, the groups take different approaches and find different applications for their work.


MAGICC Lab
Cami Peterson, who holds a Ph.D., is a faculty member who conducts research in the MAGICC Lab. She explained the “big picture” behind much of her work and the lab’s.

“Most of the algorithms we use apply to autonomous systems in general,” Peterson said. “They could be robots underwater, over water, or on land.”

An algorithm is a step-by-step mathematical procedure that determines how computer systems, such as those in MAGICC’s robots, make independent decisions. MAGICC develops algorithms that control the movement of physical robots through space.

“A lot of what I do and teach is on the control side,” Peterson said. “How do you control the vehicles? How do you do path planning… How do you make sure they actually follow that path?”
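The kind of path-following control Peterson teaches can be sketched in a few lines. The example below is purely illustrative and not the MAGICC Lab’s actual code: a simple proportional steering law (in the spirit of the widely used Stanley controller) that pulls a 2D vehicle back onto a straight-line path. The function names and gain values are this sketch’s own assumptions.

```python
import math

def cross_track_error(pos, start, end):
    """Signed lateral distance from the vehicle to the line through the path."""
    px, py = end[0] - start[0], end[1] - start[1]
    vx, vy = pos[0] - start[0], pos[1] - start[1]
    # z-component of the 2D cross product gives the signed offset
    return (px * vy - py * vx) / math.hypot(px, py)

def step(pos, start, end, k=0.8, speed=1.0, dt=0.1):
    """One control step: head along the path, corrected back toward it
    in proportion to the cross-track error."""
    error = cross_track_error(pos, start, end)
    path_heading = math.atan2(end[1] - start[1], end[0] - start[0])
    heading = path_heading - math.atan(k * error)
    return (pos[0] + speed * math.cos(heading) * dt,
            pos[1] + speed * math.sin(heading) * dt)

# Start 2 m to the side of a path along the x-axis; the lateral error
# shrinks toward zero as the vehicle converges onto the path.
pos = (0.0, 2.0)
for _ in range(200):
    pos = step(pos, (0.0, 0.0), (10.0, 0.0))
print(round(pos[1], 3))
```

Running the loop shows the vehicle’s lateral offset decaying toward zero, which is the “make sure they actually follow that path” part of Peterson’s question.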

Jaron Ellingson and Mason Peterson observe an autonomous drone in the MAGICC Lab’s prototype room. The drone’s task is to navigate along a directed trajectory. (Courtesy of MAGICC Lab)

Peterson pointed to the huge advances in autonomous vehicle technology, drones in particular, over the past few decades.

“Some of those early drones, even just a decade ago, you were trying to fly them and they were almost uncontrollable,” Peterson said. “Any slight wind or movement would blow them off course. Now you can basically take a drone and it’s stable enough that someone who hasn’t flown (a drone) before can go out and fly it.”

Peterson noted that advances in drone technology alone “open up the possibilities for how we can use them and make the world a better place.”

Jaron Ellingson, a Ph.D. student in the mechanical engineering department working in the MAGICC Lab, hopes to take advantage of these developments to build a system of autonomous drone swarms that relies on a decentralized approach.

Ellingson explained that the drones use algorithms to estimate each other’s locations and adjust their flight paths accordingly.

He envisions companies like Amazon or UPS using this system to organize large fleets of autonomous drones. “They can broadcast their location…and other drones can take that position…and avoid each other.”
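The broadcast-and-avoid idea Ellingson describes can be illustrated with a short sketch. This is a hedged toy example, not the lab’s implementation: each drone sums a small “push-away” vector from every broadcast neighbor position that falls inside a safety radius, and adds that to its own velocity. The radius, gain, and function names are assumptions of this sketch.

```python
import math

SAFE_RADIUS = 2.0  # meters: neighbors closer than this trigger avoidance

def avoidance_velocity(own_pos, broadcast_positions, gain=1.0):
    """Sum of push-away vectors from every neighbor inside SAFE_RADIUS."""
    vx = vy = 0.0
    for ox, oy in broadcast_positions:
        dx, dy = own_pos[0] - ox, own_pos[1] - oy
        dist = math.hypot(dx, dy)
        if 0 < dist < SAFE_RADIUS:
            # push grows as the neighbor gets closer, along the line away from it
            push = gain * (SAFE_RADIUS - dist) / dist
            vx += push * dx
            vy += push * dy
    return vx, vy

# Two drones 1 m apart: each is pushed directly away from the other.
print(avoidance_velocity((0.0, 0.0), [(1.0, 0.0)]))  # prints (-1.0, 0.0)
```

Because each drone computes this from positions it hears over the air, no central coordinator is needed, which is the decentralized property Ellingson is after.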

The MAGICC Lab relies on custom computer code, written by human engineers, to control the movement of autonomous vehicles. Algorithms are fine-tuned for needs and challenges that the researchers understand well. Ellingson and his squadron of drones are like a conductor and a symphony orchestra: Ellingson knows what he wants and gives the performers instructions to produce just the right sound.

Robotic Vision Lab

In another part of the Electrical and Computer Engineering department, however, students and faculty are practicing the computational equivalent of free jazz.

The Robotic Vision Lab focuses on using artificial intelligence and machine learning to realize vision in robotics. Their research includes self-driving cars, facial recognition, and food inspection.

Casey Sun, a Ph.D. student in the lab, described how Robotic Vision uses machine learning techniques to enable its projects. “You collect some clean data in a lab environment…and you try to fit the model to this clean data. The model will be able to learn some patterns in the clean data.”

To achieve robotic vision, the lab uses specialized computer programs, called neural networks, that learn to recognize patterns by constantly comparing examples of “clean” data produced in the lab with real-world examples, the so-called “noisy” data.
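The learning loop Sun describes can be shown with a toy model. The sketch below is illustrative only: where the lab trains deep neural networks, this uses a single perceptron, the simplest possible learner, to show the same “see an unfamiliar pattern, change my model” cycle. All names and the AND-gate dataset are this example’s assumptions.

```python
# Toy illustration, not the lab's model: a perceptron changes its weights
# whenever its prediction disagrees with a labeled "clean" example.
def train(examples, epochs=10):
    w, b = [0, 0], 0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # nonzero means an unfamiliar pattern
            w[0] += err * x1            # shift the model toward the example
            w[1] += err * x2
            b += err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# "Clean" training data: output is 1 only when both inputs are on (logical AND).
clean = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(clean)
print([predict(w, b, *x) for x, _ in clean])  # prints [0, 0, 0, 1]
```

Each weight update is the model’s version of Sun’s “I haven’t seen this pattern before, so I’m going to change (my model)”; after a few passes over the clean data, the predictions match the labels.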

Left to right: Josh Broekhuijsen, Shad Torrie, Andrew Sumsion, and DJ Lee show off the lab’s self-driving robotic cars. The Daily Universe spent some time with the Intelligent Multi-Agent Coordination and Control Lab and the Robotic Vision Lab to learn more about their robotics research. (Coleman Numbers)

In essence, Sun explained, the neural network says, “I haven’t seen this pattern before, so I’m going to change (my model).”

Faculty member DJ Lee, who holds a Ph.D. in electrical engineering, leads various lab projects. He described how he hopes some of the projects in the lab will benefit campus members and global communities.

“Our facial motion authentication project can improve security and convenience for users. Our visual inspection automation projects improve food safety and food production efficiency. Both will have a significant impact on our daily lives,” Lee said.

Looking towards the future

Researchers from both laboratories have expressed their vision of how robots can positively impact people’s lives.

“I hope our work on autonomy will lead to a world where people don’t need to perform repetitive, dangerous, or monotonous work, and can focus on more important and rewarding projects,” Adam Welker, an undergraduate student working in the MAGICC lab, said.

“I also believe that drones can connect us to places. Whether that is providing support to isolated rural areas or facilitating transportation in crowded cities, drones have great potential for improving our infrastructure,” said Welker.

Cami Peterson shared her fears as well as her hopes for robots. “It depends on the day. I’m either an optimist or a skeptic,” she said.

She cited self-driving cars as an example of a technology that has achieved a lot but still has a long way to go. “It’s amazing what they’ve done, but it’s still quite a challenge. There really is no substitute for how smart humans are. That’s definitely something AI has helped me appreciate: things we don’t even think about are really complicated.”


