
Who’s a good AI? Dog-based data creates a canine machine learning system

We’ve trained machine learning systems to identify objects, navigate streets, and recognize facial expressions, but as difficult as those tasks may be, they don’t even approach the level of sophistication required to simulate, for example, a dog. Well, this project aims to do just that — in a very limited way, of course. By observing the behavior of A Very Good Girl, this AI learned the rudiments of how to act like a dog.

It’s a collaboration between the University of Washington and the Allen Institute for AI, and the resulting paper will be presented at CVPR in June.

Why do this? Well, although much work has been done to simulate the sub-tasks of perception, like identifying an object and picking it up, little has been done in terms of “understanding visual data to the extent that an agent can take actions and perform tasks in the visual world.” In other words, to act not as the eye, but as the thing controlling the eye.

And why dogs? Because they’re intelligent agents of sufficient complexity, “yet their goals and motivations are often unknown a priori.” In other words, dogs are clearly smart, but we have no idea what they’re thinking.

As an initial foray into this line of research, the team wanted to see if by monitoring the dog closely and mapping its movements and actions to the environment it sees, they could create a system that accurately predicted those movements.

In order to do so, they outfitted a malamute named Kelp M. Redmon with a basic suite of sensors: a GoPro camera on Kelp’s head, six inertial measurement units (on the legs, tail, and trunk) to track where everything is, a microphone, and an Arduino to tie the data together.
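Syncing streams like this comes down to timestamp alignment: the IMUs sample faster than the camera, so each video frame gets paired with its nearest sensor reading. The article doesn’t describe the team’s actual pipeline, so this is just a minimal illustrative sketch of the idea:

```python
from bisect import bisect_left

def nearest_imu_sample(frame_ts, imu_samples):
    """Pair a camera frame timestamp with the closest IMU reading.

    imu_samples: list of (timestamp, reading) tuples, sorted by timestamp.
    Returns the reading whose timestamp is nearest to frame_ts.
    """
    timestamps = [t for t, _ in imu_samples]
    i = bisect_left(timestamps, frame_ts)
    if i == 0:
        return imu_samples[0][1]
    if i == len(imu_samples):
        return imu_samples[-1][1]
    before, after = imu_samples[i - 1], imu_samples[i]
    return after[1] if after[0] - frame_ts < frame_ts - before[0] else before[1]

# Toy streams: the IMU is sampled faster than the camera, as on the real rig.
imu = [(0.00, "a"), (0.02, "b"), (0.04, "c"), (0.06, "d")]
print(nearest_imu_sample(0.031, imu))  # "c" -- t=0.04 is closest to t=0.031
```

Real recordings would also have to handle clock drift between devices, which is part of why a single Arduino collecting everything simplifies the problem.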

They recorded many hours of activities — walking in various environments, fetching things, playing at a dog park, eating — syncing the dog’s movements to what it saw. The result is the Dataset of Ego-Centric Actions in a Dog Environment, or DECADE, which they used to train a new AI agent.

This agent, given certain sensory input — say a view of a room or street, or a ball flying past it — was to predict what a dog would do in that situation. Not to any serious level of detail, of course — but even just figuring out how to move its body and where to move it is a pretty major task.
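Framed as a learning problem, the task is roughly: given visual features for recent frames, predict a discretized movement class for each tracked body part. The sketch below is a toy stand-in for that interface — the joint names, movement bins, and nearest-centroid “model” are all illustrative assumptions, not the paper’s architecture:

```python
# Illustrative only: the real system learns from video; here a hand-built
# nearest-centroid classifier stands in for the trained predictor.

JOINTS = ["left_front", "right_front", "left_rear", "right_rear", "tail", "trunk"]
ACTION_BINS = ["still", "forward", "back"]  # discretized movement classes

def predict_actions(frame_features, centroids):
    """For each joint, pick the movement bin whose centroid is closest
    (squared distance) to the observed frame features."""
    actions = {}
    for joint in JOINTS:
        best_bin, best_dist = None, float("inf")
        for bin_name, centroid in centroids[joint].items():
            dist = sum((f - c) ** 2 for f, c in zip(frame_features, centroid))
            if dist < best_dist:
                best_bin, best_dist = bin_name, dist
        actions[joint] = best_bin
    return actions

centroids = {j: {"still": [0.0, 0.0], "forward": [1.0, 0.0], "back": [-1.0, 0.0]}
             for j in JOINTS}
print(predict_actions([0.9, 0.1], centroids))  # every joint -> "forward"
```

The key point the toy preserves is the output shape: one movement decision per joint, per time step, conditioned on what the dog sees.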

“It learns how to move the joints to walk, learns how to avoid obstacles when walking or running,” explained Hessam Bagherinezhad, one of the researchers, in an email. “It learns to run for the squirrels, follow the owner, track the flying dog toys (when playing fetch). These are some of the basic AI tasks in both computer vision and robotics that we’ve been trying to solve by collecting separate data for each task (e.g. motion planning, walkable surface, object detection, object tracking, person recognition).”

That can produce some rather complex data: for example, the dog model must know, just as the dog itself does, where it can walk when it needs to get from here to there. It can’t walk on trees, or cars, or (depending on the house) couches. So the model learns that as well, and this can be deployed separately as a computer vision model for finding out where a pet (or small legged robot) can get to in a given image.
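Once surfaces are labeled walkable or not, “where can the pet get to” becomes a reachability question over those labels. A minimal sketch, using a hand-made grid in place of the model’s per-pixel predictions (the grid and movement rules are assumptions for illustration):

```python
from collections import deque

def reachable(walkable, start):
    """Cells reachable from `start`, moving 4-directionally over walkable cells."""
    rows, cols = len(walkable), len(walkable[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and walkable[nr][nc] \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

grid = [
    [1, 1, 0, 1],   # 0 = couch/tree/car, 1 = floor
    [0, 1, 0, 1],
    [1, 1, 0, 1],
]
print(len(reachable(grid, (0, 0))))  # 5 cells: the blocked column walls off the rest
```

A small legged robot could run exactly this kind of flood fill over a learned walkability map to plan where it can and can’t go.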

This was just an initial experiment, the researchers say — successful, but with limited results. Others may consider bringing in more senses (smell is an obvious one) or seeing how a model produced from one dog (or many) generalizes to other dogs. They conclude: “We hope this work paves the way towards better understanding of visual intelligence and of the other intelligent beings that inhabit our world.”

