Many of our favorite photos are very hard to get: they’re the candid, natural moments that happen in between the posed photos, like the fleeting yet adorable looks our pets or kids give us. Last October we shared that we’ve been working on Google Clips, a lightweight, hands-free camera that uses on-device machine learning to help you capture beautiful and spontaneous moments of family, friends, pets, and yourself. Simply turn the camera on and it will capture and edit clips of these moments, while letting you join in as well.
Starting today, Clips is available in the U.S. for $249 from the Google Store, Best Buy, B&H and Verizon.
Clips isn’t designed to replace your smartphone camera or your DSLR. It’s a new type of camera that captures the moments that happen in between posed pictures by using on-device machine learning to look for great facial expressions from the people—and pets—in your life. It turns these into short clips without you having to use video editing software. Clips comes with a companion app for Android and iOS that lets you share your clips with friends or send them to other apps. You can also pick any frame from these clips to save as a high-resolution still photo.
Designed for privacy and control
From day one working on Clips, we knew privacy and control were extremely important, and we’ve been careful to design and engineer Clips to uphold those principles:
- It looks like a camera and has an indicator light, so everyone around knows what it does and when it’s on. It also works best when it’s less than 10 feet away from what it’s capturing so you can see where it is in the room.
- It doesn’t need a data connection to function, nor does it require an account. We miniaturized machine learning models to run locally on the device.
- Just like a traditional point-and-shoot camera, none of your clips leave your device until you decide to save or share them.
This article was sourced from the Google Blog.