Press "Enter" to skip to content

Posts tagged as “Embodied AI”

[twenty twenty-four day one two five]: this is it


This is it. Today, the 4th, is the last day to submit papers to the Embodied AI Workshop 2024, and we are not going to extend this deadline because we've gotten enough submissions so far that we, um, don't need to.

One more last time, the CFP:

Call for Papers

We invite high-quality 2-page extended abstracts on embodied AI, especially in areas relevant to the themes of this year's workshop:

  • Open-World AI for Embodied AI
  • Generative AI for Embodied AI
  • Embodied Mobile Manipulation
  • Language Model Planning

as well as themes related to embodied AI in general:

  • Simulation Environments
  • Visual Navigation
  • Rearrangement
  • Embodied Question Answering
  • Embodied Vision & Language

Accepted papers will be presented as posters or spotlight talks at the workshop. https://embodied-ai.org/#call-for-papers

Papers are due TODAY "anywhere on Earth" (as long as it is still today, your time).

Please send us what you've got!

-the Centaur

[twenty twenty-four day one two four]: last call for embodied ai papers!


Hey folks! Today (Saturday May 4th) is the last day to submit papers to the Embodied AI Workshop 2024!

Call for Papers

We invite high-quality 2-page extended abstracts on embodied AI, especially in areas relevant to the themes of this year's workshop:

  • Open-World AI for Embodied AI
  • Generative AI for Embodied AI
  • Embodied Mobile Manipulation
  • Language Model Planning

as well as themes related to embodied AI in general:

  • Simulation Environments
  • Visual Navigation
  • Rearrangement
  • Embodied Question Answering
  • Embodied Vision & Language

Accepted papers will be presented as posters or spotlight talks at the workshop. 

https://embodied-ai.org/#call-for-papers

Please send us what you've got! Just between you and me and the fencepost, if we get about 7+/-2 more submissions, we'll have enough to call it done for the year and won't need to extend the CFP, so we can get on with reviewing the papers and preparing for the workshop. So please submit!

-the Centaur

Pictured: the very nice logo for the Embodied AI Workshop, a joint effort of me, my co-organizer Claudia, and I think one of Midjourney or DALL-E. Yes, there's generative AI in there, but it took a good bit of prompting to get the core art, and a lot of work in Photoshop after that to make it usable.

Embodied AI Workshop Call for Papers Still Open!


Our call for papers is still open at https://embodied-ai.org/#call-for-papers through May 4th! We're particularly interested in two-page abstracts on the theme of the workshop:

  • Open-World AI for Embodied AI
  • Generative AI for Embodied AI
  • Embodied Mobile Manipulation
  • Language Model Planning

Submissions are accepted through May 4th AOE (Anywhere on Earth) at https://openreview.net/group?id=thecvf.com/CVPR/2024/Workshop/EAI#tab-recent-activity ...

-the Centaur

Announcing the 5th Annual Embodied AI Workshop


Thank goodness! At last, I'm happy to announce the Fifth Annual Embodied AI Workshop, held this year in Seattle as part of CVPR 2024! This workshop brings together vision researchers and roboticists to explore how having a body affects the problems you need to solve with your mind.

This year's workshop theme is "Open-World Embodied AI" - embodied AI when you cannot fully specify the tasks or their targets at the start of your problem. We have three subthemes:

  • Embodied Mobile Manipulation: Going beyond our traditional manipulation and navigation challenges, this topic focuses on moving objects through space at the same time as moving yourself.
  • Generative AI for Embodied AI: Building datasets for embodied AI is challenging, but we've made a lot of progress using "synthetic" data to expand these datasets.
  • Language Model Planning: Lastly but not leastly, a topic near and dear to my heart: using large language models as a core technology for planning with robotic systems.

The workshop will have six speakers and presentations from six challenges, and perhaps a sponsor or two. Please come join us at CVPR, though we also plan to support hybrid attendance.

Presumably, the workshop location will look something like the above, so we hope to see you there!

-the Centaur

Pictured: the banner for EAI#5, partially done with generative AI guided by my colleague Claudia Perez D'Arpino and Photoshoppery done by me. Also, last year's workshop entrance.

[seventy-nine] minus ninety-two: it’s OVER


After almost a year's worth of work, at last, the Fourth Annual Embodied Artificial Intelligence Workshop is OVER! I will go collapse now. Actually, it was over last night, and I did collapse, briefly, on the stairs leading up to my bedroom once the workshop was finally done. But don't worry, I was all right. I was just so relieved that it felt good to finally, briefly, collapse. A full report on this tomorrow. Off to bed.

-the Centaur

Pictured: A rainbow that appeared in the sky just as the workshop was ending. Thanks, God!

Announcing the Embodied AI Workshop #4 at CVPR 2023


Hey folks, I am proud to announce the 4th annual Embodied AI Workshop, held once again at CVPR 2023! EAI is a multidisciplinary workshop bringing together computer vision researchers, machine learning researchers, and roboticists to study the problem of creating intelligent systems that interact with their worlds.

For a highlight of previous workshops, see our Retrospectives paper. This year, EAI #4 will feature dozens of researchers, over twenty participating institutions, and ten distinct embodied AI challenges. Our three main themes for this year's workshop are:

  • Foundation Models: large, pretrained models that can solve many tasks few-shot or zero-shot
  • Generalist Agents: agents capable of solving a wide variety of problems
  • Sim to Real Transfer: learning in simulation but deploying in reality.

We will have presentations from all the challenges discussing their tasks, progress in the community, and winning approaches. We will also have six speakers on a variety of topics, and at the end of the workshop I'll be moderating a panel discussion among them.

I hope you can join us, in person or virtually, at EAI #4 at CVPR 2023 in Vancouver!

-the Centaur

I’m scared of Homebrew’s installation procedure, but I still love Homebrew


This is a followup to my Making Computers Useful series, started all the way back in 2014. (Funnily enough, the 2013-era iMac featured in that series is now pretty damn useless as it has fallen out of update range, and locks up if you run Dropbox and Google Drive at the same time).

But, the overall goal here is to document some of the stuff that I need to do to make computers work for me. Typically, there’s a lot of friction in software, and it takes a good bit of work to make that all function on a new machine. Sometimes that becomes a deep dive.

This is one of those stories.

So today, while updating the Embodied AI Workshop’s website prior to the launch of the 2023 version, I wanted to run the tree command. Tree is great because it can help you understand the structure of a directory tree, like so:
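(Roughly, on a made-up project directory, tree's output looks something like this:)

$ tree -L 2 my-site
my-site
├── package.json
├── src
│   ├── components
│   └── pages
└── static
    └── images

5 directories, 1 file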

I felt I needed this because the Embodied AI website is built on yarn and gatsby, and it turned a relatively simple site into 1.6 gigabytes of generated junk (which I noticed when one of my older computers started literally wheezing as it tried to mirror all those unexpected files):

As it turns out, you can get tree via Homebrew. Homebrew is a “package manager,” kind of like an “app store for the command line,” and Homebrew helps you get standard Linux tools, like tree, onto your Mac so you can take advantage of all the hidden Unix goodness in your Macintosh. 
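(Once Homebrew itself is installed, getting tree is a one-liner; it's Homebrew itself that's the sticking point:)

brew install tree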

However … I’m a bit leery of Homebrew because this is how it installs itself: 
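(For reference, the official instructions boil down to a single shell command, something like this:)

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"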

I mean, WHAT? curl a file and run it with bash? Seriously. Now, look, I’m not saying Homebrew isn’t safe - every indication is that it is - but that this METHOD of installation is a recipe for disaster. 

Why? Well, in case you’re not in the know, what this installation instruction is suggesting is to DOWNLOAD RANDOM CODE FROM SOMEWHERE ON THE INTERNET and RUN IT ON YOUR COMPUTER WITHOUT CHECKING IT AT ALL!

Nothing can go wrong with this plan.

Now, I’m no expert, but I’m familiar enough with this stuff to know what I’m doing. SO, first I did a few quick searches along the lines of [is homebrew for mac safe], and it appeared to be.

SO I downloaded the software with JUST the CURL part, like so:

curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh > homebrew-install.sh

... so I could examine it more closely.
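("Examine" here just means reading the thing before running it; a minimal sketch of that workflow, using the file name from the command above:)

less homebrew-install.sh       # read through the script by hand
bash -n homebrew-install.sh    # have bash syntax-check it without executing anything
bash homebrew-install.sh       # run it only once you're satisfied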

Folks, seriously, never do this on a site you do not trust.

After I had the code, I then inspected this homebrew-install.sh file to find out whether it was safe. I didn't see any obvious malware, but when I ran it, it wanted me to TYPE MY PASSWORD.

Seriously?

Please, I’m asking you, do not hot-pipe random software straight off the internet, run it from the command line, and give it your password when it asks. If someone intercepts that website and captures your password, they can do anything.

(SERIOUSLY. Once I was working with a legitimate Google representative about a Google ads program and when I went to log in to Google ads to check something, a hacker injected a fake Google ads site between me and Google, and damn near got my password. Only two-factor authentication saved me, as it broke some key link in the chain.)

BUT … it is the PATTERN I’m talking about here, not the specifics. Everything I’ve seen about Homebrew says that it is safe. I’ve even used it before, on other machines. SO, after some more research, and a little more code analysis, I confirmed the password prompt was benign, and gingerly went ahead.

And it went fine. 

I had to pay thirty million bitcoin to a Russian spammer, but I wasn’t using it anyway, and I’m sure at least they got to buy a cup of coffee or something with it. :-D

Seriously. It went fine. And I love Homebrew. I just go through this every time I need to “bash” run a piece of “curl”-ed software straight off the Internet and then it asks for my password.

Still, tree worked like a charm. (Screenshots of its use were above). There are more pieces of Homebrew software I need to install, but as one test, I tried to install “banner”, a program to create oversized pieces of text, which I use in scripts to alert me that a big task is done.

But, it seems like the Mac already has a version of banner, which works differently on the Mac than on Linux, printing VERY large ASCII banners that are vertical rather than horizontal. That’s useful, but not for my case, so I dug around for an equivalent tool, and brew install figlet turned out to be the way to go:
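(Concretely, that's just the install plus a quick test run; the message here is only an example:)

brew install figlet
figlet "Done"     # prints "Done" in large horizontal ASCII letters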

All great! 

None of this helped me with my work on the Embodied AI website, as I had already moved on to fixing other problems there, and was only “brewing” things in the background while I did other tasks (like remote-attending the church vestry retreat).

But removing this friction will help me in the future. The next time I need to examine the tree structure of a directory, it's one command away. I can put banners in my scripts. And I can easily add new software with 'brew' the next time it comes up.
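(As a sketch of that last bit, with a hypothetical task name, a script can end with a banner so the finish is hard to miss:)

#!/bin/bash
# hypothetical long-running task, followed by a big banner when it finishes
./rebuild-embodied-ai-site.sh
figlet "BUILD DONE"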

AND, as a bonus, I discovered a site which is doing something very much like what I want to do with the Making Computers Useful series, Sourabh Bajaj’s Mac OS Setup Guide, which “... covers the basics of setting up a development environment on a new Mac.” I have an internal document called “Mac OS X New System Tasks” which documents for myself the travails I go through every time I get a new Mac, and Sourabh’s guide seems like it provides a public version of what I want to do. Which is great! Less work for me. ;-D

On to the next task!

-the Centaur

P.S. As another bonus, I composed this in Google Docs and pasted it straight into Gutenberg, the new WordPress block editor. It worked like a charm ... EVEN DOWN TO PASTING IN THE IMAGES! If this is a feature of Gutenberg, I will have to consider saying my favorite three words about it ... "I was wrong."

P.P.S. Don't hold your breath on that, though, I'm waiting for the other shoe to drop.

Ripping Off the Bandaid


After almost seventeen years at Google, I've made the difficult decision to get laid off with no warning. :-) Working with Google was an amazing experience, from search to robotics to 3D objects and back to robotics again. We did amazing things and I am proud of all my great colleagues and what we accomplished together.

However, my work in robotics is not done, and I will still be pushing for better robot navigation, large language model planning, and especially social robot navigation and embodied AI. I'm spinning up an independent consulting business and will announce more details on this as it evolves - feel free to reach out directly though!

-the Centaur

P.S. Sorry for the delay - this has been up on my LinkedIn forever. But for some reason I just wasn't ready to post this here. Avoidance behavior, however, has gone on long enough. Time to move on.

Pictured: me and Ryan at Sports Page, the traditional hangout you go to on your last day at Google. It was a blast seeing all the friends, thank you for coming!

The Embodied AI Workshop is Tomorrow, Sunday, June 20th!


What happens when deep learning hits the real world? Find out at the Embodied AI Workshop this Sunday, June 20th! We’ll have 8 speakers, 3 live Q&A sessions with questions on Slack, and 10 embodied AI challenges. Our speakers will include:

  • Motivation for Embodied AI Research
    • Hyowon Gweon, Stanford
  • Embodied Navigation
    • Peter Anderson, Google
    • Aleksandra Faust, Google
  • Robotics
    • Anca Dragan, UC Berkeley
    • Chelsea Finn, Stanford / Google
    • Akshara Rai, Facebook AI Research
  • Sim-2-Real Transfer
    • Sanja Fidler, University of Toronto, NVIDIA
    • Konstantinos Bousmalis, Google

You can find us through #cvpr2021 if you’re signed up, through our webpage embodied-ai.org, or at the livestream on YouTube.

Come check it out!

-the Centaur

The Embodied AI Workshop at CVPR 2021


Hail, fellow adventurers: to prove I do something more than just draw and write, I'd like to send out a reminder of the Second Embodied AI Workshop at the CVPR 2021 computer vision conference. In the last ten years, artificial intelligence has made great advances in recognizing objects, understanding the basics of speech and language, and recommending things to people. But interacting with the real world presents harder problems: noisy sensors, unreliable actuators, incomplete models of our robots, building good simulators, learning over sequences of decisions, transferring what we've learned in simulation to real robots, or learning on the robots themselves.

Pictured: interactive vs. social navigation.

The Embodied AI Workshop brings together many researchers and organizations interested in these problems, and also hosts nine challenges which test point, object, interactive and social navigation, as well as object manipulation, vision, language, auditory perception, mapping, and more. These challenges enable researchers to test their approaches on standardized benchmarks, so the community can more easily compare what we're doing. I'm most involved as an advisor to the Stanford / Google iGibson Interactive / Social Navigation Challenge, which forces robots to maneuver around people and clutter to solve navigation problems. You can read more about the iGibson Challenge at their website or on the Google AI Blog.

Pictured: the iGibson social navigation environment.

Most importantly, the Embodied AI Workshop has a call for papers, with a deadline of TODAY.

Call for Papers

We invite high-quality 2-page extended abstracts in relevant areas, such as:

  • Simulation Environments
  • Visual Navigation
  • Rearrangement
  • Embodied Question Answering
  • Simulation-to-Real Transfer
  • Embodied Vision & Language

Accepted papers will be presented as posters. These papers will be made publicly available in a non-archival format, allowing future submission to archival journals or conferences.

The submission deadline is May 14th (Anywhere on Earth). Papers should be no longer than 2 pages (excluding references) and styled in the CVPR format. Paper submissions are now open.

I assume anyone submitting to this already has their paper well underway, but this is your reminder to git'r done.

-the Centaur

More on why your computer needs a hug


Thanks to the permission of IGI, the publisher of the Handbook of Synthetic Emotions and Sociable Robotics, the full text of "Emotional Memory and Adaptive Personalities" is now available online. I've blogged about this paper previously here and elsewhere, but now that I've got permission, here's the full abstract:

Emotional Memory and Adaptive Personalities
by Anthony Francis, Manish Mehta and Ashwin Ram

Believable agents designed for long-term interaction with human users need to adapt to them in a way which appears emotionally plausible while maintaining a consistent personality. For short-term interactions in restricted environments, scripting and state machine techniques can create agents with emotion and personality, but these methods are labor intensive, hard to extend, and brittle in new environments. Fortunately, research in memory, emotion and personality in humans and animals points to a solution to this problem. Emotions focus an animal’s attention on things it needs to care about, and strong emotions trigger enhanced formation of memory, enabling the animal to adapt its emotional response to the objects and situations in its environment. In humans this process becomes reflective: emotional stress or frustration can trigger re-evaluating past behavior with respect to personal standards, which in turn can lead to setting new strategies or goals. To aid the authoring of adaptive agents, we present an artificial intelligence model inspired by these psychological results in which an emotion model triggers case-based emotional preference learning and behavioral adaptation guided by personality models. Our tests of this model on robot pets and embodied characters show that emotional adaptation can extend the range and increase the behavioral sophistication of an agent without the need for authoring additional hand-crafted behaviors.

And so that this article is self-contained, here's the tired old description of the paper I've used a few times now:

"Emotional Memory and Adaptive Personalities" reports work on emotional agents supervised by my old professor Ashwin Ram at the Cognitive Computing Lab. He's been working on emotional robotics for over a decade, and it was in his lab that I developed my conviction that emotions serve a functional role in agents, and that to develop an emotional agent you should not start with trying to fake the desired behavior, but instead by analyzing psychological models of emotion and then using those findings to design models for agent control that will produce that behavior "naturally". This paper explains that approach and provides two examples of it in practice: the first was work done by myself on agents that learn from emotional events, and the second was work by Manish Mehta on making the personalities of more agents stay stable even after learning.

-the Centaur

Pictured is R1D1, one of the robot testbeds described in the article.