Tony Z. Zhao (@tonyzzhao) 's Twitter Profile
Tony Z. Zhao

@tonyzzhao

CS PhD student @Stanford. Aspiring full-stack roboticist. Prev: DeepMind, Tesla, Google X, Berkeley.

ID: 1074448308578336768

https://tonyzhaozh.github.io/ · Joined 16-12-2018 23:36:53

329 Tweets

13.13K Followers

816 Following

Boyuan Chen (@boyuanchen0) 's Twitter Profile Photo

Introducing Diffusion Forcing, which unifies next-token prediction (eg LLMs) and full-seq. diffusion (eg SORA)! It offers improved performance & new sampling strategies in vision and robotics, such as stable, infinite video generation, better diffusion planning, and more! (1/8)
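
The unifying idea, roughly, is that every token in the sequence is assigned its own independent noise level during training, so the same model can act as a next-token predictor (future tokens fully noised) or a full-sequence diffuser (uniform noise). Below is a minimal, hypothetical PyTorch sketch of such a training step; the `denoiser` interface and the linear noise schedule are illustrative assumptions, not the paper's actual code.

```python
import torch

def diffusion_forcing_step(denoiser, x, num_levels=1000):
    """One hypothetical training step: each token gets an independent noise level.

    x: clean token sequence, shape (batch, seq_len, dim)
    denoiser: model that predicts the clean tokens given the noisy sequence
              and the per-token noise levels (assumed interface).
    """
    b, t, d = x.shape
    # Independent noise level per token -- this is what lets one model cover
    # both next-token prediction (future fully noised) and full-seq diffusion.
    k = torch.randint(0, num_levels, (b, t), device=x.device)
    alpha = 1.0 - k.float() / num_levels                # toy linear schedule (assumption)
    noise = torch.randn_like(x)
    x_noisy = alpha.sqrt()[..., None] * x + (1.0 - alpha).sqrt()[..., None] * noise
    pred = denoiser(x_noisy, k)                         # predict the clean tokens
    loss = torch.nn.functional.mse_loss(pred, x)
    return loss
```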

Tony Z. Zhao (@tonyzzhao) 's Twitter Profile Photo

Academia is catching up fast on humanoid research. To put it in perspective, this is: a team of 5, building with off-the-shelf hardware, likely <$200k in funding, and a few months of work. What's even better is that all results are open-sourced. The capital efficiency is mind-blowing.

Huy Ha (@haqhuy) 's Twitter Profile Photo

I’ve been training dogs since middle school. It’s about time I train robot dogs too 😛 Introducing UMI on Legs, an approach for scaling manipulation skills on robot dogs 🐶. It can toss, push heavy weights, and make your ~existing~ visuo-motor policies mobile!

Cheng Chi (@chichengcc) 's Twitter Profile Photo

End-to-end imitation for surgical robots! Careful action space design continues to play a key role in getting real-world systems to work.

Remi Cadene (@remicadene) 's Twitter Profile Photo

Yesterday, I recorded 100 trajectories to teach my robot arm to grasp a LEGO block. Overnight, I trained a neural network on my MacBook Pro. Today, it can grasp pretty well. No need for an expensive setup to get started with robotics. You just need github.com/huggingface/le…
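
The workflow behind this kind of result is plain behavioral cloning: record (image, action) pairs by teleoperation, then regress actions from camera observations. A minimal sketch of that supervised loop follows; this is generic PyTorch under assumed names (`policy`, the dataset layout), not the lerobot API.

```python
import torch
from torch.utils.data import DataLoader

def behavior_cloning(policy, demo_dataset, epochs=50, lr=1e-4, device="cpu"):
    """Minimal behavioral cloning: regress demonstrated actions from images.

    demo_dataset is assumed to yield (image, action) pairs recorded by
    teleoperation; on a MacBook, device could be "mps".
    """
    policy.to(device)
    opt = torch.optim.AdamW(policy.parameters(), lr=lr)
    loader = DataLoader(demo_dataset, batch_size=64, shuffle=True)
    for _ in range(epochs):
        for image, action in loader:
            image, action = image.to(device), action.to(device)
            pred = policy(image)                                 # predicted robot action
            loss = torch.nn.functional.mse_loss(pred, action)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return policy
```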

Haozhi Qi (@haozhiq) 's Twitter Profile Photo

When I started my first project on in-hand manipulation, I thought it would be super cool but also quite challenging to make my robot hands spin pens. After almost 2.5 years of effort in this line of research, we have finally succeeded in making our robot hand "spin pens."

Qiayuan Liao (@qiayuanliao) 's Twitter Profile Photo

Excited to share a new humanoid robot platform we’ve been working on. Berkeley Humanoid is a reliable and low-cost mid-scale research platform for learning-based control. We demonstrate the robot walking on various terrains and dynamic hopping with a simple RL controller.

Zhengtong Xu (@xuzhengtong) 's Twitter Profile Photo

🤖 Introducing UniT, a tactile representation learning framework that is 🧠 Simple-to-Train: trained with data from a single simple object, and 🥂 Plug-and-Play: learns a unified representation adaptable to various tasks and objects. See thread for more details (1/4)

Tony Z. Zhao (@tonyzzhao) 's Twitter Profile Photo

Silicon Valley can keep convincing itself it is the best at “software” and “AI”, until a Chinese hardware startup pulls off the best AI-driven locomotion on a production humanoid. New hardware companies from China won’t be satisfied with manufacturing, unlike their…

Tony Z. Zhao (@tonyzzhao) 's Twitter Profile Photo

Amazing that you can now train ACT on a MacBook, with the built-in camera for observation and a <$500 robot to run the policy. A large step in democratizing robot learning. Congrats Remi Cadene and Hugging Face!
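
Part of what makes ACT practical is action chunking: the policy predicts a chunk of future actions at every step, and overlapping chunks are blended at execution time with exponentially decaying weights (temporal ensembling). A rough sketch of that inference-time blending is below; the `policy` and `env` interfaces are illustrative assumptions.

```python
import numpy as np

def run_with_temporal_ensembling(policy, env, horizon, chunk_size=100, m=0.01):
    """Execute a chunking policy with temporal ensembling (sketch, names assumed).

    At every step the policy predicts `chunk_size` future actions; all past
    predictions that cover the current step are averaged with exponential
    weights exp(-m * i), where i = 0 is the oldest prediction.
    """
    action_dim = env.action_space.shape[0]
    # all_preds[t, s] holds the action for timestep s predicted at timestep t
    all_preds = np.zeros((horizon, horizon + chunk_size, action_dim))
    covered = np.zeros((horizon, horizon + chunk_size), dtype=bool)

    obs = env.reset()
    for t in range(horizon):
        chunk = policy(obs)                           # (chunk_size, action_dim)
        all_preds[t, t:t + chunk_size] = chunk
        covered[t, t:t + chunk_size] = True

        # Blend every stored prediction for the current timestep t.
        src = np.where(covered[:, t])[0]              # timesteps whose chunk covers t
        weights = np.exp(-m * np.arange(len(src)))    # oldest prediction weighted highest
        weights /= weights.sum()
        action = (weights[:, None] * all_preds[src, t]).sum(axis=0)
        obs, _, _, _ = env.step(action)
```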

Kevin Zakka (@kevin_zakka) 's Twitter Profile Photo

"We design and manufacture low-cost and easy-to-use devices for collecting human demonstration data" 🤔 they forgot to mention git clone umi-gripper.github.io