Let's talk about Google Clips, a $250 smart camera that uses artificial intelligence to detect and capture important moments in your life. Now if you're a parent or a pet owner, you're probably familiar with the frustration of trying to capture great candids of your kids' or pets' activities. Maybe you have a ton of photos and videos of your family, but you're not in any of them because you're always the one behind the camera. Google thinks Clips can solve that frustration by providing a camera that does all of the shooting for you.
The Clips is a simple, automatic point-and-shoot camera that looks similar to a GoPro, but it's considerably smaller. It has a fixed-focus lens with a 130-degree field of view, a single button, and a few LEDs, but no display. It connects to an iPhone, Google Pixel, or Samsung Galaxy S7 or S8 over Bluetooth or Wi-Fi Direct, which lets you control the camera and download the images it captures. But the inside is what supposedly makes the Clips special: it runs Google's people-detection algorithms to recognize familiar faces and interesting activity, then automatically captures the moments you care about. The Clips isn't actually recording video or sound; technically it's shooting a burst of still images at roughly 15 frames per second, which it then stitches into seven-second clips that you can edit or pull stills from. It's basically making high-resolution GIFs out of sequences of images. You can use the big button on the front of the camera to force a capture, or you can use the app on your phone to see what the camera is viewing and take shots there. But the whole point of Clips is to let the camera and Google's algorithms do all of the heavy lifting.
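To make that GIF-like stitching a bit more concrete, here's a rough Python sketch of the idea: take a burst of stills shot at about 15 frames per second and glue seven seconds' worth of them into an animated GIF. The frame rate matches what Google describes, but the folder name and the use of the imageio library are my own assumptions, not anything Google actually ships on the camera.

```python
# A minimal sketch, not Google's actual pipeline: it assumes a folder of JPEG
# stills captured at roughly 15 frames per second and stitches the first
# seven seconds' worth into an animated GIF. Paths and names are hypothetical.
import glob

import imageio

FPS = 15                        # approximate capture rate of the Clips camera
CLIP_SECONDS = 7                # length of one stitched clip
FRAMES_PER_CLIP = FPS * CLIP_SECONDS

# Load the burst of stills in capture order (hypothetical folder name).
frame_paths = sorted(glob.glob("burst/*.jpg"))
frames = [imageio.imread(path) for path in frame_paths[:FRAMES_PER_CLIP]]

# Write the frames out as one animated GIF played back at the capture rate.
imageio.mimsave("clip.gif", frames, fps=FPS)
```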
Now to facilitate this, the Clips comes with a silicone case that makes it easy to prop up almost anywhere or clip to things. But it's not designed to be a body camera; you're supposed to just set it down and leave it alone. You can adjust the frequency of captures in the Clips app, and you can also train the camera on the people who matter to you by linking it with your Google Photos account. The Clips is supposed to learn the faces of important people based on who it's exposed to most often, and pulling in your Photos data is supposed to speed that along. You can also push the button on the front of the camera to take a direct portrait of someone you want Clips to prioritize. Once the camera has captured a bunch of clips, you use the app to browse through them on your phone. You can edit them down to shorter versions, grab still images out of them, or just save the whole thing to your phone's storage for sharing and editing later.
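To give a rough sense of what "prioritizing familiar faces" might look like in code, here's a loose sketch using the open-source face_recognition library. It's nothing like Google's on-device model, and the reference portraits and frame file names are made up for illustration; it just scores a frame higher for every known face it finds.

```python
# A loose illustration, not Google's on-device model: score frames higher
# when they contain a face the camera has been "taught". The example file
# names ("kid.jpg", "partner.jpg", frame paths) are hypothetical.
import face_recognition

# Encodings for the people the owner cares about, e.g. from deliberate portraits.
known_faces = [
    face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for path in ["kid.jpg", "partner.jpg"]
]

def frame_priority(frame_path, tolerance=0.6):
    """Return a simple score: +1 for every known face found in the frame."""
    image = face_recognition.load_image_file(frame_path)
    score = 0
    for encoding in face_recognition.face_encodings(image):
        matches = face_recognition.compare_faces(known_faces, encoding, tolerance=tolerance)
        score += sum(matches)
    return score

# Frames with familiar faces sort to the front of the queue.
best_frames = sorted(["frame_001.jpg", "frame_002.jpg"], key=frame_priority, reverse=True)
```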
So Google's definitely onto something here, and I hope it's eventually able to make capturing pictures of kids and pets this much easier.