Projects

Our research is sponsored in part by NSF, NDSEG, USDA, Google,
Siemens, HP, Honda, Autodesk, Toyota Research Institute, Amazon
Robotics, Intel, Samsung, Cisco, and Intuitive Surgical.  The AUTOLAB
is located in 2111 Etcheverry Hall and 444 Soda Hall, UC Berkeley.

Check out what we are currently working on, and find opportunities to contribute! This list includes open-source projects and AUTOLab-specific utilities. For more projects, visit our GitHub page.

Dex-Net for Robust Robot Grasping

Combining analytic mechanics theory with Deep Learning using sim2real at scale facilitates highly efficient, robust grasping that generalizes surprisingly well to new objects.

Cloud Robotics

We are exploring how the ability to access remote data, code, and processing can facilitate scalable robot learning.

Safety Augmented Value Estimation from Demonstrations (SAVED)

Safe Deep Model-Based RL for sparse-cost robotic tasks. SAVED addresses two challenges: hand-engineering a dense cost function is difficult and can lead to unintended behavior, and dynamical uncertainty makes exploration and constraint satisfaction challenging.

Deep Learning from Demonstrations

We are developing new methods where humans can efficiently teach robots to robustly perform tasks such as surgical needle insertion, grasping in clutter, part singulation, rope-tying and assembly.

Robot-Assisted Surgery

We’re developing new geometric models and algorithms for automating surgical subtasks such as suturing and debridement using the da Vinci surgical robot.

AlphaGarden

Can polyculture farming be automated?  AlphaGarden is an ongoing robotic artwork and research project that juxtaposes natural and artificial intelligence to reflect on the natural world and our role and place within it.

New Media Artforms

To discover what can be expressed with new technologies such as networks, robots, digital cameras, and sensors that could not previously be expressed, we’re designing art installations that explore issues related to cultural history, privacy, and “telepistemology”: what is knowable at a distance.

A Sample of Prior Projects:

  • Debridement – Fast and reliable autonomous surgical debridement.
  • DART – Disturbances for Augmenting Robot Trajectories (DART) collects demonstrations with injected noise, and optimizes the noise level to approximate the error of the robot’s trained policy during data collection.
  • GQ-CNN – Our Python package for grasp quality convolutional neural networks.
  • YuMiPy – Our Python package for interfacing with ABB’s YuMi robot.
  • MeshRender – A set of Python utilities for rendering 3D meshes with OpenGL.
  • Perception – AUTOLab’s toolkit for robot perception tasks.
  • Tele-Actor – Internet-based “online robots” now provide public access to remote locations such as museums and laboratories. The Tele-Actor is a collaborative online teleoperation system for distance learning that allows many students to simultaneously share control of a single mobile resource.
  • Jester 5.0 – Jester is a joke recommender system designed to study social information filtering. Its collaborative filtering algorithm uses universal queries to elicit real-valued user ratings on a common set of items and applies principal component analysis (PCA) to the resulting dense subset of the ratings matrix. PCA facilitates dimensionality reduction for offline clustering of users and rapid computation of recommendations.
  • ACONE, CONE – (Automated) Collaborative Observatory for Natural Environments.
  • ANA – Anytime Nonparametric A* (ANA) algorithm.
  • Motion Planning in Medicine – Optimization and Simulation Algorithms for Image-Guided Procedures (Monograph).
  • ALAN – Assembly Line Adaptive Netbot (ALAN), a proposed practical robot system for applications in manufacturing, material handling, and food production that is emerging from discussions between leaders from industry and academic research.
  • RAPID – Robot-Assisted Precision Irrigation Delivery (RAPID) is a co-robotic approach where a team of humans and robots moves through the fields to adjust low-cost adjustable drip irrigation emitters at the plant level.
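The DART entry above describes collecting demonstrations with injected noise, with the noise level tuned to approximate the trained policy's error. A minimal sketch of that idea, with illustrative function names (not the lab's actual code):

```python
import numpy as np

def estimate_noise_scale(policy_actions, supervisor_actions, alpha=1.0):
    """Set the injected-noise scale from the learner's empirical error:
    the per-dimension RMS deviation between the trained policy's actions
    and the supervisor's actions on the same states, scaled by alpha."""
    err = np.asarray(policy_actions) - np.asarray(supervisor_actions)
    return alpha * np.sqrt(np.mean(err ** 2, axis=0))

def noisy_demonstration(supervisor, states, sigma, rng):
    """Roll out the supervisor with Gaussian noise of scale sigma injected
    into the executed actions, while labeling each state with the clean
    supervisor action (a toy open-loop sketch, not a full robot loop)."""
    demo = []
    for s in states:
        a = supervisor(s)                       # clean label for training
        a_exec = a + rng.normal(0.0, sigma)     # noisy action to execute
        demo.append((s, a, a_exec))
    return demo
```

The point of the noise is to expose the supervisor to states the imperfect learner will visit, so the demonstrations cover the learner's own error distribution.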
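The Jester entry above describes applying PCA to a dense submatrix of ratings on a common ("gauge") item set, then matching users in the reduced space. A small sketch of that projection step, assuming a ratings matrix where every user has rated every gauge item; names and the nearest-neighbor lookup are illustrative:

```python
import numpy as np

def pca_recommend(dense_ratings, new_user_gauge, k=2, n_neighbors=3):
    """Project users' gauge-set ratings onto k principal components and
    return the indices of the existing users nearest to a new user in
    that reduced space (stand-in for offline cluster lookup)."""
    R = np.asarray(dense_ratings, dtype=float)
    mean = R.mean(axis=0)
    X = R - mean                                  # center each gauge item
    # Principal axes via SVD of the centered ratings matrix.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    proj = X @ Vt[:k].T                           # existing users in PC space
    q = (np.asarray(new_user_gauge, dtype=float) - mean) @ Vt[:k].T
    dists = np.linalg.norm(proj - q, axis=1)
    return np.argsort(dists)[:n_neighbors]
```

Because the PCA projection and user clustering can be computed offline on the dense gauge-set matrix, only the cheap projection and lookup happen at recommendation time, which is what makes the approach fast online.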