Projects
A more complete list of my projects can be found in my CV here.
Learning Task-Oriented Grasps from Limited Labeled Data
This work is currently under review at CoRL 2022. We propose a deep-learning-based method for detecting task-oriented grasps that leverages a pre-trained, general grasp quality network branch to generalize to a new task from as few as 10 to 20 training examples. A pre-print is available upon request.
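To give a flavor of the idea (this is only a rough sketch under my own assumptions, not the paper's architecture), the recipe is roughly: keep a pre-trained, task-agnostic grasp quality network frozen and fine-tune a small task-specific head on a handful of labeled task-oriented grasps. All names below (`GraspQualityBackbone`, `TaskHead`, the tensor shapes) are hypothetical placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pre-trained, task-agnostic grasp quality network.
class GraspQualityBackbone(nn.Module):
    def __init__(self, in_dim=1024, feat_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim), nn.ReLU(),
        )

    def forward(self, grasp_candidates):
        return self.encoder(grasp_candidates)  # shared grasp features

# Small task-specific head, fine-tuned from ~10-20 labeled grasps.
class TaskHead(nn.Module):
    def __init__(self, feat_dim=128):
        super().__init__()
        self.classifier = nn.Linear(feat_dim, 1)

    def forward(self, features):
        return self.classifier(features).squeeze(-1)  # task-suitability logit

backbone = GraspQualityBackbone()
# In practice the backbone would load pre-trained grasp quality weights here.
for p in backbone.parameters():
    p.requires_grad = False  # keep the general grasp knowledge fixed

head = TaskHead()
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Tiny labeled set: grasp descriptors + "suitable for this task?" labels.
x = torch.randn(16, 1024)               # e.g., 16 labeled task-oriented grasps
y = torch.randint(0, 2, (16,)).float()

for _ in range(100):
    optimizer.zero_grad()
    logits = head(backbone(x))
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()
```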
Learning to Detect Multi-Modal Grasps for Dexterous Grasping in Dense Clutter
I presented this paper at IROS 2021. See the blog post about it here and the code here.
Drake Controller & Model: Robotiq 3-Finger Adaptive Gripper
My implementation of a hybrid controller for Robotiq’s 3-Finger Adaptive Gripper is available here. The underlying hybrid model is described in Technical Report: Use of Hybrid Systems to Model the Robotiq Adaptive Gripper.
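The Drake code linked above is the real implementation; as a toy illustration only, here is what a hybrid (mixed discrete/continuous) model of a single finger link might look like: discrete modes, each with its own continuous dynamics, and a guard condition that triggers a mode switch. The modes, dynamics, and thresholds below are invented for illustration and are not taken from the technical report.

```python
from dataclasses import dataclass

@dataclass
class FingerState:
    mode: str      # discrete mode: "free" or "contact"
    angle: float   # continuous state: link angle (rad)

def step(state: FingerState, motor_velocity: float, contact_angle: float,
         dt: float = 0.01) -> FingerState:
    """Advance a toy hybrid model of one finger link by one time step.

    In "free" mode the link tracks the motor velocity; once it reaches the
    (hypothetical) contact angle, a guard fires and the model switches to a
    "contact" mode in which the link is blocked and stops moving.
    """
    if state.mode == "free":
        angle = state.angle + motor_velocity * dt   # free-motion dynamics
        if angle >= contact_angle:                  # guard: object contact
            return FingerState(mode="contact", angle=contact_angle)
        return FingerState(mode="free", angle=angle)
    # "contact" mode: the object blocks the link, so the angle is held.
    return FingerState(mode="contact", angle=state.angle)

# Example: close the finger until it hits an object at 0.8 rad.
state = FingerState(mode="free", angle=0.0)
for _ in range(200):
    state = step(state, motor_velocity=0.5, contact_angle=0.8)
print(state)  # FingerState(mode='contact', angle=0.8)
```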
Robot Object Retrieval with Contextual Natural Language Queries
This system lets a user request any object from a robot by the task it will be used for, without the robot explicitly classifying each available object. Because queries are verb phrases such as “pass me something to cut with” rather than noun phrases such as “pass me the scissors,” the system implicitly learns objects’ utilities and generalizes to novel objects.
Video credit: Thao Nguyen
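The retrieval idea can be sketched (very roughly, and not as the paper’s actual model) as scoring candidate objects against the verb-phrase query in a shared embedding space and returning the best match. The embedding vectors and object names below are placeholders; a trained model would produce the embeddings from the query text and from each object’s visual features.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder embeddings standing in for a learned joint language/object space.
rng = np.random.default_rng(0)
query_embedding = rng.normal(size=64)            # e.g., "something to cut with"
object_embeddings = {
    "scissors": rng.normal(size=64),
    "mug": rng.normal(size=64),
    "knife": rng.normal(size=64),
}

# Retrieve the object whose embedding best matches the query. No explicit
# "cutting tool" class label is needed, so a novel object can still score
# highly if its learned features match the verb's learned utility.
best_object = max(object_embeddings,
                  key=lambda name: cosine(query_embedding, object_embeddings[name]))
print(best_object)
```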
Kuka iiwa Interface
Ben Abbatematteo and I developed the kuka_brown repository, which contains all the code required to control our lab’s Kuka LBR iiwa robot arm, Robotiq 3-Finger Adaptive Gripper, and various depth sensors.
Grasp Pose Detection in Dense Clutter with a UR5
As a master’s student at Northeastern, I ported High Precision Grasp Pose Detection in Dense Clutter to our UR5 arm with a Robotiq parallel-jaw gripper and worked on various improvements to increase the grasp success rate.
Video credit: Andreas ten Pas