Robert X. Gao
Cady Staley Professor
Chair of Department of Mechanical and Aerospace Engineering
Case Western Reserve University
Data-Enhanced Mechatronic Systems for Smart Manufacturing
Date & Location:
June 28th 8:30-9:30 am (Wednesday), Cascade Ballroom
Driven by the exponential growth of data from widely deployed sensors and continued advances in computational infrastructure, AI and machine learning are transforming mechanical, electrical, and computer engineering. This transformation has led to a data-driven paradigm that complements and augments model-based techniques in the design and optimization of mechatronic systems for information acquisition and control. The outcome is improved functionality, efficiency, and reliability of mechatronic systems, advancing the state of smart manufacturing.
This presentation traces the data-enabled pathway towards integrating physics-based and data-driven methods for mechatronic systems in manufacturing. Recent progress in sensing, process monitoring, and process control enabled by this integration is illustrated. New research trends, such as concerted hardware-software co-design, are discussed. The presentation demonstrates the potential of data-driven methods as a key enabler to complement physical science for advancing mechatronics in realizing smart manufacturing.
Robert Gao is the Cady Staley Professor of Engineering and Department Chair of Mechanical and Aerospace Engineering at Case Western Reserve University in Cleveland, Ohio. Since receiving his Ph.D. degree from the Technical University of Berlin, Germany, in 1991, he has worked on physics-based signal transduction mechanisms, stochastic modeling, mechatronic system design, and AI/ML-based data analytics for improving the observability of cyber-physical systems such as manufacturing machines, with the goal of improving process and product quality control.
Professor Gao is a Fellow of ASME, SME, IEEE, and CIRP, and a Distinguished Fellow of the International Institute of Acoustics and Vibration (IIAV). He has published over 400 technical papers, including more than 190 journal articles, as well as three books, and holds 13 patents. He has received several professional awards, including the ASME Milton C. Shaw Manufacturing Research Medal (2023), ASME Blackall Machine Tool and Gage Award (2018), SME Eli Whitney Productivity Award (2019), IEEE Instrumentation and Measurement Society Technical Award (2013), IEEE Best Application in Instrumentation and Measurement Award (2019), Hideo Hanafusa Outstanding Investigator Award (2018), and several Best Paper awards. Prof. Gao is Chair of the Scientific Committee of the North American Manufacturing Research Institute (NAMRI/SME) and Chair of the Collaborative Working Group on AI in Manufacturing (CWG-AI) of CIRP. He has served as an Associate Editor for several journals and is currently a Senior Editor of the IEEE/ASME Transactions on Mechatronics.
Philip L. Freeman
Senior Technical Fellow
Assembly Automation & Robotics
Boeing Research & Technology
From R&D to Production: Challenges in automation for aerospace
Date & Location:
June 28th 1:30-2:30 pm (Wednesday), Cascade Ballroom
Aerospace has always been a challenging environment for automation. Long takt times, low volumes, high variation, and requirements for high precision make it difficult to transition laboratory R&D into qualified production automation. Boeing has a decades-long history of developing, advancing, and deploying automation for aerospace production. As we look to the future of production, we see new opportunities to accelerate the development and transition of new automation from R&D to production readiness. In this presentation, we share some examples of automation at Boeing, current work in leveraging autonomy and robotics, and opportunities in simulation and virtual commissioning to accelerate the development, qualification, and deployment of production-ready systems.
Dr. Phil Freeman is a Senior Technical Fellow in Boeing Research and Technology (BR&T) focused on Advanced Production Systems, Assembly Automation, and Precision Robotics. As a Senior Technical Fellow in the area of Materials and Manufacturing Technology, Dr. Freeman has expertise in robotics, automation, and control. He works from Boeing's Research and Technology Center in South Carolina. From 2012 to 2014, Dr. Freeman worked with BR&T South Carolina on 787 production support, helping the program meet production ramp-up rate targets. Prior to that, he worked in the Assembly and Integration Technology team in St. Louis, where he helped implement many of the automated drilling systems on the F/A-18 and F-15. Before that, he served as Boeing's liaison to the Advanced Manufacturing Research Centre in Sheffield, UK, where he led the Centre's development of an automated assembly research team, now the AMRC's Integrated Manufacturing Group (IMG). Since joining Boeing in 1998, Dr. Freeman's research has focused primarily on improving the accuracy of precision automated drilling and milling systems through accurate kinematic modeling and the use of robust machine vision. He holds over 30 patents covering a range of manufacturing technologies and has authored several publications on machine tool volumetric accuracy and machine vision for inspection. His current focus is leveraging simulation and model-based engineering to reduce the startup time of new automation technologies. Dr. Freeman is a member of the American Society of Mechanical Engineers (ASME), where he serves on the Board of Strategic Initiatives, is vice chairperson of the ASME B5.TC52 standards committee on machine tool performance, and is a contributing member of the Subcommittee on Robotic Arms (Manipulators). He is also a member of the Institute of Electrical and Electronics Engineers (IEEE), where he previously served on the industrial advisory board of the Robotics and Automation Society (RAS).
Dr. Freeman earned his D.Sc. in System Science and Mathematics (2012), his M.S. in Mechanical Engineering (2003), and his B.S. in Mechanical Engineering (1997) all from Washington University in St. Louis.
Michael Yip
Associate Professor
IEEE RAS Distinguished Lecturer
Director of the Advanced Robotics and Controls Laboratory (ARCLab)
UC San Diego Contextual Robotics Institute
The new age of learning-based robot motion planning
Date & Location:
June 29th 8:30-9:30 am (Thursday), Cascade Ballroom
Robots and other autonomous systems need to understand how to move in complex and dynamic environments while avoiding or minimizing unwanted contact.
Despite over 40 years of evolution, classical motion planning solutions have been hitting practical limits in many real-world environments due
to their unpredictability as well as the curse of dimensionality. Even with today's best algorithms, we often experience unsatisfactory behavior or
performance: robots take many seconds or even minutes to think before they move, and even then, the movement may appear unusually roundabout and
suboptimal. Higher-level considerations, including safety, responsiveness, and accounting for uncertainty, can also add significant challenges.
Now, machine learning has arrived at the motion planning problem and promises to overcome the current limitations of classical techniques, providing
a transformative leap in autonomous planning and control. How does it achieve this? In this talk, I will introduce our work in motion
planning networks that started this path toward neural planners, breaking the mold of how robots should plan for navigation. In both simulation
and real-world examples, we show how this research area has grown to solve multi-manipulator coordination, task and motion planning, kinodynamically
constrained motion planners, autonomous driving, and more.
Michael Yip is an Associate Professor at the UC San Diego Contextual Robotics Institute, IEEE RAS Distinguished Lecturer, Hellman Fellow, and Director
of the Advanced Robotics and Controls Laboratory (ARCLab). His group currently focuses on solving problems in data-efficient and computationally efficient
robot control and motion planning through the use of various forms of learning representations, including deep learning and reinforcement learning strategies.
These techniques target problems in robot manipulation and locomotion on novel, dexterous platforms, including surgical robot manipulators,
continuum robots, snake-like robots, and vehicular systems. His work has been recognized through several best paper awards and nominations at ICRA and IROS,
the 2017 best paper award for IEEE Robot and Automation Letters, and received the NSF CAREER and the NIH Trailblazer awards.
Dr. Yip was previously a Research Associate with Disney Research Los Angeles and a Visiting Professor at Stanford University and at Amazon Robotics'
Machine Learning and Computer Vision Group. He received a B.Sc. in Mechatronics Engineering from the University of Waterloo, an M.S. in Electrical
Engineering from the University of British Columbia, and a Ph.D. in Bioengineering from Stanford University.
Celia Oakley
Chief Information Officer (CIO) of Opener
Working from home is nice, but flying to work is better
Date & Location:
June 29th 1:30-2:30 pm (Thursday), Cascade Ballroom
How would you like to climb into your personal aircraft, take off, and be whisked away to your destination? For recreation, you could soar over trees, rivers,
and hillsides, marvel at the earth's beauty below, travel to locations unreachable by car, and revel in remote areas of nature. For work, you could dash high
above commuter traffic, as the crow flies, arrive well rested and ready to get things done, and interact with colleagues while suppressing a grin. We at Opener
are taking steps toward making this dream come true with the personal aerial vehicle called BlackFly. Classified as an ultralight, BlackFly can be flown today
in non-congested areas. Taking off and landing vertically eliminates the need for a runway, and no pilot's license is required. In this talk, I'll describe what
it means to be an ultralight vehicle, discuss the technological advances that came together to enable the creation of BlackFly, share some key considerations in
the design and development of personal aerial vehicles, and summarize how far we've come. Throughout my talk, I'll share videos tracing BlackFly's evolution. So
buckle your seat belt and get ready to take off: Watching BlackFly in action, you'll share in the thrill of three-dimensional freedom.
Celia Oakley is the Chief Information Officer (CIO) of Opener, where she has worked for more than eight years on BlackFly, the company's groundbreaking
personal eVTOL aircraft. After designing and implementing Opener's flight testing program, she moved on to oversee the development of Opener's information
systems: internal custom web applications, cloud-to-aircraft communication, website, enterprise software platforms, and IT. Before arriving at Opener, Dr. Oakley
was a member of the Stanford Racing Team that created Stanley, the world's first successful self-driving car, which won the DARPA Grand Challenge in 2005.
Dr. Oakley received her B.S. in Mechanical Engineering from U.C. Berkeley and her M.S. and Ph.D. in Mechanical Engineering, with a Minor in Computer Science,
from Stanford University.
Xiaobo Tan
MSU Foundation Professor
Richard M. Hong Endowed Chair of
Electrical and Computer Engineering
Michigan State University (MSU)
Sea lamprey, e-skin, and robotic fish: Mechatronic solutions to invasive species control
Date & Location:
June 30th 8:30-9:30 am (Friday), Cascade Ballroom
The sea lamprey, sometimes known as "vampire fish", is an invasive species in the Great Lakes region that threatens its ecosystems and billion-dollar fisheries.
The parasitic sea lamprey uses its suctorial mouth to prey on various host fish, attaching to the fish and draining its body fluids. In this talk we first describe
our effort in developing a soft pressure sensor array as an electronic skin (e-skin), for detecting the suction by adult sea lampreys during their upstream
migration for spawning. Such e-skins can be mounted at strategically chosen places, such as selective fishways, to facilitate the capture and population
assessment of sea lampreys. We discuss regularized least-square algorithms for mitigating the crosstalk in the resistor network of the sensor array, to properly
reconstruct the pressure profile under lamprey suction. Machine learning is further adopted to automate the lamprey detection process, as verified with data
from animal experiments.
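To give a flavor of the regularized least-squares idea mentioned above, the sketch below reconstructs a pressure profile from crosstalk-coupled readings via ridge regression. This is an illustrative toy, not the speakers' actual algorithm: the crosstalk matrix, its 10% neighbor-coupling values, and the regularization weight are all invented for the example.

```python
import numpy as np

def ridge_reconstruct(A, b, lam=1e-4):
    """Recover the true pressure profile x from coupled readings b,
    assuming a linear crosstalk model (readings = A @ x), by solving
    the regularized least-squares problem
        x = argmin ||A x - b||^2 + lam * ||x||^2
    whose normal equations are (A^T A + lam I) x = A^T b.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Toy 4-taxel array: each reading mixes in 10% of its neighbors (crosstalk).
true_pressure = np.array([0.0, 1.0, 0.2, 0.0])
A = np.eye(4) + 0.1 * (np.eye(4, k=1) + np.eye(4, k=-1))
readings = A @ true_pressure                 # what the sensor would report
estimate = ridge_reconstruct(A, readings)    # close to true_pressure
```

The regularization term keeps the solve well conditioned when the crosstalk model is nearly singular or the readings are noisy, at the cost of a small bias controlled by `lam`.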
In the second part of the talk we explore tracking the movement of fish, such as sea lampreys, with mobile acoustic telemetry, which provides key information about
fish migration patterns and habitat use and is thus critical to decision-making in fishery management. In mobile acoustic telemetry, acoustic tags are implanted
in fish and emit pings periodically, which are picked up by acoustic receivers mounted on robots to infer the fish location. We discuss the use of gliding
robotic fish and unmanned surface vehicles for tracking acoustic tags, and specifically, we show how distributed filtering by a group of robots can result in
localization of a moving target based on the time-differences-of-arrival (TDOAs) of the emitted signal.
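As a minimal sketch of TDOA-based localization, the example below recovers a tag position from the arrival-time differences at four receivers using a brute-force grid search over candidate positions. It stands in for, and is much simpler than, the distributed filtering described in the talk; the receiver layout, grid extent, and sound speed are assumptions made up for the example.

```python
import numpy as np

def tdoa_localize(receivers, tdoas, c=1500.0, half_width=50.0, step=0.5):
    """Locate an acoustic tag in 2-D from time-differences-of-arrival.

    receivers : (n, 2) receiver positions (e.g. robot locations), metres.
    tdoas     : (n-1,) arrival times at receivers 1..n-1 minus the
                arrival time at receiver 0, seconds.
    c         : assumed speed of sound in water, m/s.

    Evaluates the TDOA residual on a square grid and returns the
    candidate point with the smallest squared error.
    """
    xs = np.arange(-half_width, half_width, step)
    X, Y = np.meshgrid(xs, xs)
    pts = np.stack([X.ravel(), Y.ravel()], axis=1)       # candidate positions
    dists = np.linalg.norm(pts[:, None, :] - receivers[None, :, :], axis=2)
    pred = (dists[:, 1:] - dists[:, :1]) / c             # predicted TDOAs
    cost = np.sum((pred - tdoas) ** 2, axis=1)
    return pts[np.argmin(cost)]

# Simulated tag at (10, -5) m, heard by four receivers.
receivers = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 30.0], [-30.0, -30.0]])
tag = np.array([10.0, -5.0])
d = np.linalg.norm(receivers - tag, axis=1)
tdoas = (d[1:] - d[0]) / 1500.0
estimate = tdoa_localize(receivers, tdoas)               # near (10, -5)
```

Each TDOA constrains the source to one branch of a hyperbola; with three or more independent differences in 2-D, the hyperbolas generically intersect at a single point, which the grid search finds to within the grid resolution.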
Dr. Xiaobo Tan is an MSU Foundation Professor and the Richard M. Hong Endowed Chair in Electrical and Computer Engineering at Michigan State University (MSU). He received his bachelor's and master's degrees in automatic control from Tsinghua University, Beijing, China, in 1995 and 1998, respectively, and his Ph.D. in electrical and computer engineering (ECE) from the University of Maryland in 2002. His research interests include bio-inspired robots, soft sensors and actuators, and modeling and control of systems with hysteresis. In particular, his group has developed and field-tested autonomous underwater and surface robots for mobile sensing applications. He has published over 300 papers and been awarded four US patents in these areas.
Dr. Tan is a Fellow of IEEE and ASME. He is a recipient of the NSF CAREER Award (2006), MSU Teacher-Scholar Award (2010), MSU College of Engineering Withrow Distinguished Scholar Award (2018), Distinguished Alumni Award from the ECE Department at the University of Maryland (2018), MSU William J. Beal Outstanding Faculty Award (2023), and multiple best paper awards. Dr. Tan is keen to integrate his research with educational and outreach activities; he served as Director of an NSF-funded Research Experiences for Teachers (RET) Site program at MSU from 2009 to 2016 and as Curator of a robotic fish exhibit at the MSU Museum from 2016 to 2017.
Teddy Seyed
Senior Researcher
Microsoft Research
Beyond Conventional Interfaces: Exploring the Intersection of Wearable Technologies, Textiles, and Physical Computing
Date & Location:
June 30th 1:30-2:30 pm (Friday), Cascade Ballroom
How can physical computing and interactive fabrics change the way we engage with everyday objects and environments? In this talk, I delve into the transformative
potential of applying physical computing principles to wearable technologies and smart textiles, highlighting different breakthroughs such as pocket-based textile
sensors capable of detecting user input and recognizing objects carried in our pockets, as well as new touch-sensitive interfaces that leverage different
materials like graphene-based fabrics, among others. These advancements hint at a new era in human-computer interaction, where the seamless integration
of wearable technologies, textiles, and physical computing leads to novel, intuitive, and context-aware interactions with the objects and surroundings in our
daily lives. As the field rapidly progresses from fundamental research to commercialization, this talk will showcase the state of the art at the intersection
of these domains, as well as describe future research directions and applications that will reshape the way we experience and interact with the world around us.
Dr. Teddy Seyed is a Senior Researcher at Microsoft Research, located in Redmond, WA, USA. He holds the distinction of being the first in Canada to receive an Entrepreneurial PhD in Computer Science, earned from the University of Calgary. His PhD dissertation also won the Bill Buxton Award for the best Human-Computer Interaction (HCI) dissertation in Canada. Dr. Seyed's research primarily focuses on HCI for the development and exploration of wearables, fashion technology, physical computing, and new devices and modalities. His work has been featured in publications such as Forbes Magazine and Gizmodo.
In addition to his research pursuits, Dr. Seyed has a strong entrepreneurial spirit, co-founding several startups, successfully completing crowdfunding campaigns, shipping products, and participating in competitive business accelerators. Currently, he leads the Future of Wearables mini-group at Microsoft Research.