Google hired Pro Photographers to train its AI Camera “Google Clips”

Google Clips

Along with the launch of the Google Pixel 2 back in October 2017, Google introduced Google Clips, an AI-powered camera. Just a week ago, it appeared that Google would soon make its lifelogging camera available for sale.

So, probably just before the camera hits the stores, Google wants its customers to know what made it possible to train its artificially intelligent camera to tell good photographs from bad ones, and how the camera decides to capture the best shot from the events happening in its frame.

Google recently published a blog post revealing that a documentary filmmaker, a photojournalist, and a fine art photographer helped its engineers train the camera. They had to feed the machine learning algorithm example photographs so it could learn what counts as a good or bad photo. Take, for example, a scene where the user’s hand is covering the lens or a frame is blurred – that’s a bad scene or moment to capture.
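Conceptually, teaching a camera to separate "keep" frames from "discard" frames is a supervised classification problem: collect labeled examples of good and bad photos and fine-tune an image classifier on them. The sketch below illustrates only that general idea; the folder layout, MobileNetV2 backbone, and hyperparameters are assumptions for the example and not Google’s actual pipeline.

```python
# Illustrative sketch: fine-tune a small image classifier on labeled
# "good" vs. "bad" example frames. NOT Google's actual model or pipeline;
# dataset layout, backbone, and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

# Assumed folder layout: frames/good/*.jpg and frames/bad/*.jpg
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("frames", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Pretrained backbone with a 2-class head (good / bad)
model = models.mobilenet_v2(weights="IMAGENET1K_V1")
model.classifier[1] = nn.Linear(model.last_channel, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(3):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```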

“By ruling out the stuff the camera wouldn’t need to waste energy processing (because no one would find value in it), the overall baseline quality of captured clips rose significantly,” said Josh Lovejoy, a Senior Interaction Designer at Google, in the blog post. “We needed to train models on what bad looked like: hands in front of the camera, quick and shaky movements, blurriness,” he added.

Bad scene example – Courtesy Google

Not only that, Google also trained the models on “stability, sharpness, and framing.” The engineers had to translate basic principles of photography into concepts the models could learn, having conversations about “focus, depth-of-field, rule-of-thirds, dramatic lighting, match cuts, storytelling.” However, they also learned that one “should never underestimate the profound human capability to wield common sense.”
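Sharpness, at least, can be approximated with simple heuristics before any learned model gets involved. The snippet below uses the common variance-of-Laplacian blur measure via OpenCV as a rough illustration; the threshold is an arbitrary example value, and this is not the method Clips uses internally.

```python
# Generic blur heuristic: variance of the Laplacian of a grayscale frame.
# Not the method Google Clips uses; the threshold is an arbitrary example.
import cv2

def sharpness_score(image_path: str) -> float:
    """Return a sharpness score; lower values indicate a blurrier frame."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

if __name__ == "__main__":
    score = sharpness_score("frame.jpg")
    print("blurry" if score < 100.0 else "sharp", score)
```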

In the end, Lovejoy says, “Success with Clips isn’t just about keeps, deletes, clicks, and edits (though those are important), it’s about authorship, co-learning, and adaptation over time. We really hope users go out and play with it.”

For detailed information on how Google’s AI-powered camera works, you can visit the source below. To learn more about Google Clips itself, you should also read about Intel’s Movidius VPU, which runs inside the camera.