Google showed off a new AR Translate feature for Lens in September that uses the same technology as the Magic Eraser on the Pixel. Ahead of that rollout, Google Lens has now replaced the built-in camera translator in Google Translate.
In addition to visual search, with its various shopping, object, and landmark identification use cases, Google Lens is good at lifting text from the real world for copy and paste. The Translate filter overlays a translation on top of the foreign text in the scene, which better preserves context. It also works offline if you download the language pack ahead of time.
What changes with Google Lens:
A camera tool has been available in the Google Translate mobile apps for a long time. It was last updated in 2019 with auto-detect and support for more languages, and last year the Android app's UI was refreshed as part of the larger Material You redesign.
Because of the overlap between the two camera tools, Google Lens's filter is now taking the place of the native Translate feature. In both Translate mobile apps, tapping the camera now simply opens the Lens UI.
This invokes the system-level capability on Android, whereas the iOS app now includes its own Lens instance. When launching from Google Translate, only the “Translate” filter is available; the other Lens features cannot be used. At the top, you can manually change the languages, toggle the flash, and tap “Show original text.” In the bottom-left corner, you can import images and screenshots from your device.
This update has already rolled out in the Android and iOS versions of Google Translate.
This consolidation makes sense ahead of AR Translate, which features major advances in AI. The current method overlays the translated text on top of the image using solid color blocks.
Going forward, Google Lens will use the Pixel’s Magic Eraser technology to remove the original text from images, and the translated text will match the original’s style.
Lastly, AR Translate will arrive later this year, working live in the Google Lens camera and on screenshots in as little as 100 milliseconds.