
Abstracts and Bios

Monday, June 19th
Larry Matthies
Grand challenges in aerial robotics for planetary exploration
Abstract: Three solid bodies in the solar system besides Earth have enough atmosphere for aerial mobility to be an interesting possibility for exploration: Mars, Titan, and Venus. "Aerial mobility" in this case can be interpreted to include aerial maneuvering for precision landing, as well as airborne systems to move from place to place during exploration, including balloons, planes, and rotorcraft. Mars has a very thin, transparent atmosphere; Titan has a much denser, very hazy, and very cold atmosphere; and Venus has an extremely dense, hot, opaque atmosphere. This leads to very different technical challenges for aerial mobility at each of these bodies. This talk will give a brief review of aerial mobility concepts that have been considered for these bodies, summarize current work toward rotorcraft for exploring Mars and Titan, and outline challenges for the future in this area.

Bio: Dr. Larry Matthies supervises the Computer Vision Group at the Jet Propulsion Laboratory, where he has contributed to the development of autonomous navigation systems for rovers, landers, and orbiters for planetary exploration. He also conducts research on perception systems for autonomous navigation of unmanned ground and air vehicles on Earth. He is a Fellow of the IEEE, a member of the editorial boards of Autonomous Robots and the Journal of Field Robotics, and an adjunct professor of computer science at the University of Southern California. He obtained his PhD in computer science from Carnegie Mellon University in 1989.

Jonathan P. How
UAV Planning under Uncertainty: Theory and Practice
Abstract: This talk will describe recent progress in the Aerospace Controls Lab at MIT on planning and control of autonomous aerial systems operating in dynamic environments, with an emphasis on addressing the planning challenges faced on various timescales. For example, autonomous unmanned aerial vehicles (UAVs) need to plan/execute safe paths and avoid imminent collisions given noisy sensory information (short timescale) and perform complex cooperative tasks given imperfect models and knowledge of the environment (long timescale). On aerial systems, these tasks are often constrained to be done using lightweight, low-power onboard computation and perception, which typically adds significant complexity to the system. The talk will highlight several recently developed solutions to these challenges, including a fast local path planner for low-latency, high-speed navigation, the introduction of macro-observations for semantic-level perception-enabled planning, and a framework leveraging temporal abstraction for multi-UAV planning on long timescales. These solution approaches are implemented on robotic hardware that has demonstrated high-speed agile flight of a quadrotor in unknown, cluttered environments and real-time cooperative multi-UAV planning with an onboard deep learning-based perception system. The talk will also showcase perception-in-the-loop planning using a new projector system, known as the Measurable Augmented Reality for Prototyping Cyber-Physical Systems (MAR-CPS), that is used to visually augment the planning and learning experiments in a motion-capture facility.

Bio: Dr. Jonathan P. How is the Richard C. Maclaurin Professor of Aeronautics and Astronautics at the Massachusetts Institute of Technology. He received a B.A.Sc. from the University of Toronto in 1987 and his S.M. and Ph.D. in Aeronautics and Astronautics from MIT in 1990 and 1993, respectively. He then studied for two years at MIT as a postdoctoral associate for the Middeck Active Control Experiment (MACE) that flew onboard the Space Shuttle Endeavour in March 1995. Prior to joining MIT in 2000, he was an Assistant Professor in the Department of Aeronautics and Astronautics at Stanford University.
He is the Editor-in-Chief of the IEEE Control Systems Magazine and an Associate Editor for the AIAA Journal of Aerospace Information Systems. Professor How received the 2002 Institute of Navigation Burka Award, a Boeing Special Invention Award in 2008, the IFAC Automatica award for best applications paper in 2011, the AeroLion Technologies Outstanding Paper Award for the journal Unmanned Systems in 2015, and AIAA Best Paper in Conference Awards in 2011, 2012, and 2013; he also won the IEEE Control Systems Society Video Clip Contest in 2015. He is a Fellow of AIAA and a senior member of IEEE.

Lakmal Seneviratne
Role of Competitions in Technology Innovations in Micro Unmanned Aerial Vehicles
Abstract: Micro unmanned aerial vehicles (UAVs) have recently emerged as a powerful technology with the potential to have a major economic and societal impact. This has resulted in continuing investment in UAVs, in anticipation of new market opportunities. These new markets will require UAVs to work in crowded, unstructured, dynamic environments, while operating with increased autonomy in the civilian airspace and maintaining safety. These challenges are unsolved and are active topics of research and development. These unsolved problems have also given rise to an increasing number of global robotics competitions. Competitions have the potential to accelerate innovation and provide application-focused solutions, while calibrating the gap between expectations and reality. In this presentation we will review and discuss the role of international competitions in catalyzing future technological innovations in micro unmanned aerial vehicles, with a particular focus on the recently completed Mohammed Bin Zayed International Robotics Challenge.

Bio: Lakmal Seneviratne is the founding Director of the Robotics Institute, Associate VP for Research, and Professor of Mechanical Engineering at Khalifa University, UAE. He is also the Technical Director of the Mohammed Bin Zayed International Robotics Challenge. Prior to joining Khalifa University he was Professor of Mechatronics, founding Director of the Centre for Robotics Research, and Head of the Division of Engineering at King’s College London. His main research interests are centred on robotics and automation, with particular emphasis on increasing the autonomy of robotic systems interacting with complex dynamic environments. He has published over 350 peer-reviewed publications on these topics.

Debadeepta Dey
Adaptive Information Gathering via Imitation Learning
Abstract: In the adaptive information gathering problem, a robot is required to select an informative sensing location using the history of measurements acquired thus far. While there is an extensive amount of prior work investigating effective practical approximations using variants of Shannon’s entropy, the efficacy of such policies heavily depends on the geometric distribution of objects in the world. On the other hand, the principled approach of employing online POMDP solvers is rendered impractical by the need to explicitly sample online from a posterior distribution of world maps. We present a novel data-driven imitation learning framework to efficiently train information gathering policies. The policy imitates a clairvoyant oracle - an oracle that at train time has full knowledge about the world map and can compute maximally informative sensing locations. We analyze the learnt policy by showing that offline imitation of a clairvoyant oracle is implicitly equivalent to online oracle execution in conjunction with posterior sampling. This observation allows us to obtain powerful near-optimality guarantees for information gathering problems possessing an adaptive submodularity property. As we demonstrate on a spectrum of 2D and 3D exploration problems with aerial vehicles, the trained policies enjoy the best of both worlds - they adapt to different world map distributions while being computationally inexpensive to evaluate.
In closely related recent work, I will highlight how to leverage submodularity to plan efficient trajectories for 3D reconstruction that are aware of environment and battery constraints, resulting in better reconstructions than the state of the art for the same battery budget.
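As a concrete illustration of the oracle-imitation idea, the sketch below trains a policy to score candidate sensing locations by imitating a clairvoyant oracle in a toy grid world. The grid world, the 3x3 sensor footprint, the linear scorer, and all helper names are assumptions made for illustration; this mirrors a DAgger-style aggregate-and-refit loop, not the authors' actual implementation.

```python
# Toy sketch of imitating a clairvoyant oracle for adaptive sensing.
# All modeling choices here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
GRID, HORIZON, ROUNDS, EPISODES = 16, 8, 5, 40
CANDS = [(r, c) for r in range(1, GRID - 1, 4) for c in range(1, GRID - 1, 4)]

def sample_world():
    # Stand-in for the distribution over world maps: random "objects of interest".
    return rng.random((GRID, GRID)) < 0.25

def footprint(loc):
    # Cells revealed by sensing at loc (a fixed 3x3 sensor footprint).
    r, c = loc
    return {(i, j) for i in range(r - 1, r + 2) for j in range(c - 1, c + 2)}

def oracle_gain(world, seen, loc):
    # Clairvoyant oracle: counts unseen *occupied* cells, which requires the
    # true world map and is therefore only available at train time.
    return float(sum(world[i, j] for (i, j) in footprint(loc) - seen))

def features(seen, loc):
    # The learned policy sees only the measurement history (coverage stats).
    return [len(footprint(loc) - seen) / 9.0, len(seen) / (GRID * GRID), 1.0]

X, y, w = [], [], np.zeros(3)
for _ in range(ROUNDS):
    for _ in range(EPISODES):
        world, seen = sample_world(), set()
        for _ in range(HORIZON):
            # Label all candidates with the oracle's gain at states visited
            # by the *current* policy (the data-aggregation step).
            for loc in CANDS:
                X.append(features(seen, loc))
                y.append(oracle_gain(world, seen, loc))
            scores = [w @ np.array(features(seen, loc)) for loc in CANDS]
            seen |= footprint(CANDS[int(np.argmax(scores))])
    # Refit a linear scorer on everything collected so far (least squares).
    w = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)[0]

print("learned scorer weights:", w)
```

At test time only the cheap learned scorer is evaluated, which is the "best of both worlds" property the abstract describes: oracle-quality supervision at train time, low cost online.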

Bio: Debadeepta Dey is a researcher in the Adaptive Systems and Interaction (ASI) group led by Dr. Eric Horvitz at Microsoft Research, Redmond. He received his PhD at the Robotics Institute, Carnegie Mellon University, advised by Prof. Drew Bagnell.
He is interested in bridging the gap between perception and planning for autonomous ground and aerial vehicles. His interests include decision-making under uncertainty, reinforcement learning, planning, and perception. He is especially interested in aerial vehicle autonomy, ranging from small quadrotors to large gliders, and balances his time between fundamental theoretical advances and pushing the state of the art in realizable practical systems.

Giuseppe Loianno
Flying Robots: Agile Autonomous Navigation and Physical Interaction
Abstract: Flying robots are starting to play a major role in several tasks such as search and rescue, interaction with the environment, inspection, and monitoring. Unfortunately, their dynamics make them extremely difficult to control, and this is particularly true in the absence of external positioning systems such as GPS and motion capture. Additionally, autonomous maneuvers based on onboard sensors are still very slow compared to those attainable with motion-capture systems. Aggressive and agile navigation with Micro Aerial Vehicles (MAVs) through unknown environments poses a number of challenges in terms of perception, state estimation, planning, and control. To achieve this, MAVs have to localize themselves in unstructured environments. This in turn requires the MAV to fuse a combination of absolute or relative asynchronous measurements, provided by different noisy sensors at different rates, into a reliable state estimate at rates above 200 Hz. In this talk, we will present recent research results on the pose estimation and planning problems for agile and aggressive flight and physical interaction, using a minimal onboard sensor suite composed mainly of a single camera and an Inertial Measurement Unit (IMU). For truly autonomous agile navigation, the perception, planning, and control problems have to be solved concurrently. We will demonstrate how these different technologies can be combined into an integrated and robust solution enabling aggressive and agile flight maneuvers with MAVs in different scenarios, including the possibility to interact with the environment.
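The multi-rate asynchronous fusion problem mentioned above can be illustrated with a deliberately tiny example: a 1D constant-velocity Kalman filter consuming a time-ordered merge of a fast velocity sensor (IMU-like, 200 Hz) and a slow absolute-position sensor (camera-like, 20 Hz). The scalar model, rates, and noise values are illustrative assumptions only; a real MAV estimator fuses full 6-DoF states with an IMU-driven prediction step.

```python
# Minimal sketch of fusing asynchronous measurements arriving at different
# rates into one state estimate; all numbers are illustrative assumptions.
import numpy as np

x = np.zeros(2)                      # state: [position, velocity]
P = np.eye(2)                        # state covariance
Q = np.diag([1e-4, 1e-2])            # process noise (per second)

def predict(dt):
    global x, P
    F = np.array([[1.0, dt], [0.0, 1.0]])
    x = F @ x
    P = F @ P @ F.T + Q * dt

def update(z, H, R):
    global x, P
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

# Two asynchronous sensors: a 200 Hz velocity source and a 20 Hz absolute
# position source, merged into one stream by timestamp.
events = [(i / 200.0, "vel", 1.0) for i in range(200)] + \
         [(i / 20.0, "pos", i / 20.0) for i in range(20)]
t_prev = 0.0
for t, kind, z in sorted(events):
    predict(t - t_prev)              # propagate to the measurement time
    if kind == "vel":
        update(np.array([z]), np.array([[0.0, 1.0]]), np.array([[1e-2]]))
    else:
        update(np.array([z]), np.array([[1.0, 0.0]]), np.array([[1e-3]]))
    t_prev = t

print("fused state [pos, vel]:", x)
```

The key structural point is that each measurement, whatever its source or rate, triggers a prediction to its own timestamp followed by an update with its own measurement model, so heterogeneous sensors never need to be synchronized.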

Bio: Giuseppe Loianno is a research scientist in the General Robotics, Automation, Sensing and Perception (GRASP) Laboratory at the University of Pennsylvania. He received the B.Sc. and M.Sc. degrees in automation engineering, both with honors, from the University of Naples "Federico II" in December 2007 and February 2010, respectively. He received his Ph.D. in computer and control engineering, focusing on robotics, in May 2014 in the PRISMA Lab group led by Prof. Dr. Bruno Siciliano. He was involved in the EU FP7 project AIRobots (www.airobots.eu) on sensor fusion and visual control. He has published more than 35 conference papers, journal papers, and book chapters. Beginning in April 2013, he worked for 14 months with the GRASP Lab at the University of Pennsylvania, supervised by Prof. Dr. Vijay Kumar. From June 2014 to July 2015, he was a postdoctoral researcher in that lab, where he is currently a research scientist. His research interests include visual odometry, sensor fusion, and visual servoing for micro aerial vehicles. He received the Conference Editorial Board Best Reviewer Award at ICRA 2016. His work has been featured in many renowned news outlets and magazines worldwide, such as IEEE Spectrum and MIT Technology Review.


Kostas Daniilidis
Event-based Vision for Challenging Speeds and Illuminations
Abstract: Asynchronous event-based sensors enable UAVs to capture their environment at high speed, under low light or high dynamic range, and without motion blur. However, traditional frame-based vision approaches cannot be applied, since there is no predefined spatiotemporal neighborhood. We introduce a new approach for feature tracking in event-based systems that associates events with features probabilistically and enables persistent tracking over time at very high speeds. Based on the computed optical flow, a temporal neighborhood is defined by a bound on displacement rather than by a time window. We use this feature tracking to develop an event-based visual-inertial odometry system that can operate under challenging speed and illumination conditions.
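To make the displacement-bound neighborhood concrete, here is a small sketch under assumed conventions (events as (x, y, t) triples, a locally constant flow estimate at the feature): the length of the temporal neighborhood follows from a pixel displacement budget divided by the feature's image speed, and events are motion-compensated to a common time before being associated with the feature. The hard distance threshold below stands in for the talk's probabilistic association; it illustrates the neighborhood construction only.

```python
# Sketch: temporal neighborhood from a displacement bound, not a time window.
# Event format, flow model, and thresholds are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def events_near_feature(events, feat_xy, flow, t0, max_disp_px=3.0, radius=2.0):
    """events: (N, 3) array of (x, y, t); flow: (vx, vy) in px/s at the feature."""
    speed = np.hypot(*flow) + 1e-9
    # The window length in time follows from the displacement bound.
    dt_max = max_disp_px / speed
    ev = events[(events[:, 2] >= t0) & (events[:, 2] <= t0 + dt_max)]
    # Warp events back to time t0 along the flow (motion compensation).
    warped = ev[:, :2] - np.outer(ev[:, 2] - t0, flow)
    keep = np.linalg.norm(warped - feat_xy, axis=1) < radius
    return ev[keep], warped[keep]

# Synthetic stream: a feature at (10, 10) moving at 100 px/s in x fires
# events along its path, mixed with uniform background noise events.
t = np.sort(rng.uniform(0.0, 0.1, 500))
on_track = rng.random(500) < 0.5
x = np.where(on_track, 10 + 100 * t + rng.normal(0, 0.5, 500), rng.uniform(0, 64, 500))
y = np.where(on_track, 10 + rng.normal(0, 0.5, 500), rng.uniform(0, 64, 500))
events = np.column_stack([x, y, t])

ev, warped = events_near_feature(events, np.array([10.0, 10.0]), (100.0, 0.0), t0=0.0)
print(f"{len(ev)} events associated; displacement budget gave a "
      f"{3.0 / 100.0 * 1e3:.0f} ms window")
```

Note how the window shrinks automatically as the feature moves faster, which is exactly why a fixed time window is the wrong primitive at high speeds.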

Bio: Kostas Daniilidis is the Ruth Yalom Stone Professor of Computer and Information Science at the University of Pennsylvania, where he has been on the faculty since 1998. He is an IEEE Fellow. He was the director of the interdisciplinary GRASP laboratory from 2008 to 2013, Associate Dean for Graduate Education from 2012 to 2016, and has been Faculty Director of Online Learning since 2016. He obtained his undergraduate degree in Electrical Engineering from the National Technical University of Athens in 1986 and his PhD (Dr. rer. nat.) in Computer Science from the University of Karlsruhe in 1992, under the supervision of Hans-Hellmut Nagel. He was Associate Editor of IEEE Transactions on Pattern Analysis and Machine Intelligence from 2003 to 2007. He co-chaired IEEE 3DPVT 2006 with Pollefeys, and he was Program co-chair of ECCV 2010. His most cited works have been on visual odometry, omnidirectional vision, 3D pose estimation, 3D registration, hand-eye calibration, structure from motion, and image matching. Kostas’ main interests today are in deep learning of 3D representations, data association, event-based cameras, semantic localization and mapping, and vision-based manipulation.


TECHNOLOGY, POLICY, AND INNOVATION PANEL
with:
Jon Resnick, Policy Lead, DJI
Sunmin Kim, Congressional Innovation Fellow, U.S. Senate
Nader Elm, CEO, Exyn Technologies, Inc.
Archna Sahay, Director, Entrepreneurial Investment, City of Philadelphia
Ellen Hwang, Program Manager for Innovation Management, City of Philadelphia



Tuesday, June 20th
Anibal Ollero
Advances in aerial robotic manipulation for applications in inspection and maintenance 
Abstract: This presentation will deal with new methods and technologies developed mainly in the H2020 AEROARMS project on aerial robotics with multiple arms and advanced manipulation capabilities for inspection and maintenance. Particularly, new aerial robots for compliant manipulation with two arms while flying will be presented. The talk will also include perception, including localization and visual servoing without markers, as well as local planning systems to generate safe reactions in constrained environments near the objects being manipulated. Finally, the application cases of the project will be introduced.

Bio: Anibal Ollero is a full professor and head of the GRVC group (70 members) at the University of Seville, and Scientific Advisor of the Center for Advanced Aerospace Technologies (CATEC) in Seville, Spain. He has been a full professor at the Universities of Santiago and Malaga (Spain), Scientific Director of CATEC, and a researcher at the Robotics Institute of Carnegie Mellon University (Pittsburgh, USA) and LAAS-CNRS (Toulouse, France). He has authored more than 650 publications, including 9 books and 141 journal papers, and has led more than 150 projects, transferring results to many companies. He has participated in 22 European projects, coordinating 6 of them, including the recently concluded FP7 integrated projects ARCAS and EC-SAFEMOBIL and the ongoing H2020 AEROARMS. He has received 17 awards, has supervised 33 PhD theses, and is currently co-chair of the IEEE Technical Committee on Aerial Robotics and Unmanned Aerial Vehicles, a member of the Board of Directors and coordinator of the Aerial Robotics Topic Group of euRobotics, and president of the Spanish Society for Research and Development in Robotics. He has served on several advisory committees, including the Conseil Scientifique of CNRS (France), and has been a member of the EURON board, Vice-Chair of the IFAC Technical Board, and Coordinating Committee Chair and Technical Committee Chair of IFAC.


Shaojie Shen
Hong Kong University of Science and Technology
Director, HKUST-DJI Joint Innovation Laboratory
Autonomous Aerial Navigation using Monocular Visual-Inertial Fusion
Abstract: I will present recent results from the HKUST Aerial Robotics Group towards robust autonomous navigation using only the sensory information from a monocular camera and an IMU. We argue that this is the minimum sensor suite that enables the full capability for autonomous navigation, including state estimation, dense mapping, trajectory planning, and feedback control. We will present a self-calibrating visual-inertial state estimator that not only provides accurate local state estimates for feedback control, but also achieves global consistency through large-scale graph optimization. Building on top of the estimator, we propose an onboard method to generate large-scale dense maps that are sufficient for obstacle detection and avoidance. We close the perception-action loop with an optimization-based trajectory generation method that finds safe trajectories in an online fashion. Experimental results are demonstrated on our custom-built quadrotor testbed.
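The closed perception-action loop can be sketched structurally: every cycle the vehicle updates its state estimate and map, then replans a safe action toward the goal. In the toy version below, the estimator and dense mapper are replaced by a known 2D occupancy grid, and the optimization-based trajectory generator by a fan of short straight-line motion primitives; only the loop structure is being illustrated, not the method from the talk.

```python
# Structural sketch of a perception-action loop: map -> plan -> act, repeated.
# The grid, primitives, and costs are stand-ins for the real pipeline.
import numpy as np

occ = np.zeros((40, 40), dtype=bool)      # stand-in for the onboard dense map
occ[15:25, 18:22] = True                  # one obstacle block
pos, goal = np.array([5.0, 5.0]), np.array([35.0, 35.0])

def collision_free(p0, p1, grid, steps=40):
    # Sample the segment p0 -> p1 against the occupancy grid.
    pts = p0 + np.linspace(0.0, 1.0, steps)[:, None] * (p1 - p0)
    idx = np.clip(pts.astype(int), 0, np.array(grid.shape) - 1)
    return not grid[idx[:, 0], idx[:, 1]].any()

for step in range(60):
    if np.linalg.norm(goal - pos) < 1.5:
        break
    # "Replan" every cycle: score a fan of short motion primitives and keep
    # the collision-free one making the most progress toward the goal.
    best, best_cost = None, np.inf
    for ang in np.linspace(-np.pi, np.pi, 32, endpoint=False):
        cand = pos + 2.0 * np.array([np.cos(ang), np.sin(ang)])
        if collision_free(pos, cand, occ):
            cost = np.linalg.norm(goal - cand)
            if cost < best_cost:
                best, best_cost = cand, cost
    if best is not None:
        pos = best                        # "control" tracks the chosen plan
print("reached:", pos.round(1), "after", step, "cycles")
```

Replanning from the current state on every cycle is what lets the loop absorb estimation drift and newly mapped obstacles without any global precomputed plan.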

Bio: Shaojie Shen received his B.Eng. degree in Electronic Engineering from the Hong Kong University of Science and Technology in 2009. He received his M.S. in Robotics and his Ph.D. in Electrical and Systems Engineering from the University of Pennsylvania in 2011 and 2014, respectively. He joined the Department of Electronic and Computer Engineering at the Hong Kong University of Science and Technology in September 2014 as an Assistant Professor. He is the founding director of the HKUST-DJI Joint Innovation Laboratory. His research interests are in the areas of robotics and unmanned aerial vehicles, with focus on state estimation, sensor fusion, computer vision, localization and mapping, and autonomous navigation in complex environments. He and his research team were best paper finalists at ICRA 2011, best service robotics paper finalists at ICRA 2017, won best paper awards at SSRR 2015 and SSRR 2016, and took first prize at IARC 2015.

Chad Sweet
Qualcomm Technologies, Inc. Research
Aerial Robotics Powered by Snapdragon
Abstract: Technological advancement in aerial robotics continues at a robust pace. Qualcomm Research is developing the next generation of high-bandwidth sensory processing to enable the future of autonomous navigation for aerial robotics.

Bio: Chad Sweet is Sr. Director of Engineering at Qualcomm Research. He will discuss the technologies being developed at Qualcomm Research and what the future may hold.

David Baran
Army Research Lab
DOD Research in UAVs
Bio: Mr. David Baran is a Team Lead in the US Army Research Laboratory's (ARL) Computational and Information Sciences Directorate and leads the Joint Experiments Thrust of the ARL Micro Autonomous Systems Technologies (MAST) CTA. He has over 10 years of experience in conducting autonomy research for intelligent systems, with a focus on robotics. His research interests include the experimental validation of autonomy algorithms for small ground and aerial platforms, including mapping, exploration, path planning, and multi-agent collaboration.

Camillo Jose Taylor
Bio: Dr. Taylor received his A.B. degree in Electrical, Computer and Systems Engineering from Harvard College in 1988 and his M.S. and Ph.D. degrees from Yale University in 1990 and 1994, respectively. Dr. Taylor was the Jamaica Scholar in 1984, a member of the Harvard chapter of Phi Beta Kappa, and held a Harvard College Scholarship from 1986 to 1988. From 1994 to 1997 Dr. Taylor was a postdoctoral researcher and lecturer with the Department of Electrical Engineering and Computer Science at the University of California, Berkeley. He joined the faculty of the Computer and Information Science Department at the University of Pennsylvania in September 1997. He received an NSF CAREER award in 1998 and the Lindback Minority Junior Faculty Award in 2001. In 2012 he received a best paper award at the IEEE Workshop on the Applications of Computer Vision. Dr. Taylor's research interests lie primarily in the fields of computer vision and robotics and include reconstruction of 3D models from images, vision-guided robot navigation, and smart camera networks. Dr. Taylor has served as an Associate Editor of the IEEE Transactions on Pattern Analysis and Machine Intelligence. He has also served on numerous conference organizing committees and was a Program Chair of the 2006 edition of the IEEE Conference on Computer Vision and Pattern Recognition and of the 2013 edition of 3DV. In 2012 he was awarded the Christian R. and Mary F. Lindback Foundation Award for Distinguished Teaching at the University of Pennsylvania.

Andrew Browning
High speed reactive obstacle avoidance and aperture traversal using a monocular camera 
Authors: N. Andrew Browning & Hector Escobar-Alvarez 
Abstract: Flight in cluttered indoor and outdoor environments requires effective detection of obstacles and rapid trajectory updates to ensure successful avoidance. We present a low-computation, monocular-camera-based solution that rapidly assesses collision risk in the environment through the computation of Expansion Rate, and fuses this with the range and bearing to a goal location (or object) in a Steering Field to steer around obstacles while flying towards the goal. The Steering Field provides instantaneous steering decisions based on the current collision risk in the environment, while Expansion Rate provides an automatically speed-scaled estimate of collision risk. Results from recent flight tests will be shown, with flights at up to 20 m/s around man-made and natural obstacles and through 5x5 m apertures.
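A minimal numeric sketch of the two named quantities: here Expansion Rate is approximated as the divergence of the optical-flow field (a looming cue, roughly inverse time-to-contact), and the Steering Field scores candidate headings as goal attraction minus risk-weighted obstacle repulsion. The Gaussian repulsor shape, the gains, and the risk scaling are invented for illustration and are not the authors' formulation.

```python
# Sketch of an expansion-rate collision cue and a steering field.
# Field shapes and gains are illustrative assumptions.
import numpy as np

def expansion_rate(flow_x, flow_y):
    # Divergence of the optical-flow field: positive divergence means the
    # scene is looming, i.e. an approach that scales with speed.
    return np.gradient(flow_x, axis=1) + np.gradient(flow_y, axis=0)

def steer(goal_bearing, obstacle_bearings, obstacle_risks,
          headings=np.linspace(-np.pi, np.pi, 181), k_goal=1.0, k_obs=2.0):
    # Steering field over candidate headings: goal attraction minus a
    # Gaussian repulsor per obstacle, weighted by its collision risk.
    wrap = lambda a: np.angle(np.exp(1j * a))       # wrap to [-pi, pi]
    field = -k_goal * np.abs(wrap(headings - goal_bearing))
    for b, risk in zip(obstacle_bearings, obstacle_risks):
        field -= k_obs * risk * np.exp(-(wrap(headings - b) / 0.4) ** 2)
    return headings[np.argmax(field)]

# Synthetic uniformly expanding flow, as produced by pure forward motion.
ys, xs = np.mgrid[-20:20, -20:20].astype(float)
risk = float(expansion_rate(0.05 * xs, 0.05 * ys).mean())

# An obstacle looming 20 degrees to the left pushes the command rightward.
cmd = steer(goal_bearing=0.0, obstacle_bearings=[np.deg2rad(-20)],
            obstacle_risks=[50 * risk])
print(f"expansion rate ~{risk:.2f}; commanded heading {cmd:+.2f} rad")
```

Because divergence grows with closing speed, the same thresholds yield earlier avoidance at higher flight speeds, which matches the "automatically speed-scaled" property claimed in the abstract.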

Bio: Dr. N. Andrew Browning obtained his PhD in Computational Neuroscience from Boston University with a thesis on how primates and humans process visual information for reactive navigation; the resulting neural model was built into a multi-layer convolutional neural network (called ViSTARS) and demonstrated, in simulation, to generate human-like trajectories in cluttered reactive navigation tasks. Following his PhD, applied post-doctoral research, and a brief stint as a Research Assistant Professor at BU, Dr. Browning started a research group at Scientific Systems Company Inc. (SSCI) to develop Active Perception and Cognitive Learning (APCL) systems as applied to autonomous robotic systems. The APCL lab at SSCI has developed into a global leader in applied perception and autonomy solutions for small UAVs. Dr. Browning is now Deputy Director of Research and Development at SSCI, with a broad remit across advanced controls, single-vehicle and collaborative autonomy, visual perception, acoustic perception, and vision-aided GNC.
 

TECHNOLOGY, INNOVATION, AND CAPITAL PANEL
with:
Nader Elm, Exyn Technologies
Damon Henry, Asylon
Matthew Piccoli, iQinetics
Denis Dancanet, Jetoptera
Brett Topche, Red & Blue Ventures
Doc Parghi, SRI Capital
Katherine O'Neill, Jumpstart NJ Angel Network
Michael Poisel, Penn Center for Innovation Ventures

Zhiyuan Li
DJI

Michael Shomin & Matthew Turpin
Qualcomm Technologies, Inc.
