Artificial intelligence and object recognition are everywhere these days. On Android, the first name that comes to mind is Google Lens, although some manufacturers have built their own version into their customization layer. At Huawei, it is called HiVision and is integrated into the camera app.
HiVision is essentially EMUI's answer to Google Lens: an object recognition system that can be used for real-time translations, reading QR codes, or identifying objects. It is similar to Google Lens, although it does some things more and others less, as we will see next.
What is HiVision and how to access it
Let's start at the beginning: what is it and how do you activate it? HiVision is image recognition through the phone's camera, and it is accessed from the camera app by tapping on its icon (see screenshot above). The position of this icon has changed from previous versions of EMUI, so you may find it at the bottom of the screen as one more shooting mode. For quick access, you can also swipe up from the lock screen and tap on its icon.
EMUI has a similar option called HiTouch, which does almost the same thing but without the camera: it recognizes whatever is shown on the phone's screen at that moment. It is an optional feature (you can enable it under Accessibility Features > HiTouch) and it triggers object recognition when you press on the screen with two fingers at the same time. This recognition is only used to search for products to buy.
What you can do with HiVision
Now that we know what HiVision is, let's see what you can do with it. Many of its functions overlap to a greater or lesser extent with those of Google Lens, so we will compare them to see which one does the job better.
Some Google Lens features are not available in HiVision, at least for now: selecting text through the camera and the special mode for restaurant menus. Here is everything HiVision allows you to do and how it compares to its equivalent in Google Lens.
One of the most useful features of Google Lens is being able to read QR codes without installing any additional app. HiVision can do exactly the same, and the system works just as well, supporting all kinds of QR codes including plain text, websites, contacts, and Wi-Fi connection settings.
Both Google Lens and HiVision are good at reading QR codes, so there are few differences between them. The main one is that in HiVision the result is displayed as soon as it is read, while in Google Lens you need to tap on the code to read it.
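For the curious, the Wi-Fi codes mentioned above are just text payloads in a well-known format (`WIFI:T:<auth>;S:<ssid>;P:<password>;;`), which is what readers like HiVision and Google Lens decode. As a rough illustration, here is a minimal sketch in Python that builds such a payload; the function name is our own, not anything from Huawei or Google:

```python
def wifi_qr_payload(ssid: str, password: str, auth: str = "WPA") -> str:
    """Build the de facto standard Wi-Fi QR payload: WIFI:T:...;S:...;P:...;;"""
    def esc(value: str) -> str:
        # Backslash, semicolon, comma, colon and quotes must be escaped.
        for ch in ('\\', ';', ',', ':', '"'):
            value = value.replace(ch, '\\' + ch)
        return value
    return f"WIFI:T:{auth};S:{esc(ssid)};P:{esc(password)};;"

print(wifi_qr_payload("HomeNet", "s3cret;pass"))
# WIFI:T:WPA;S:HomeNet;P:s3cret\;pass;;
```

Encode that string into any QR image and a phone pointing its camera at it will offer to join the network, no typing required.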
HiVision includes its own real-time translation using the mobile camera. Language support is more limited than in Google Translate or Lens, but includes Chinese, English, Japanese, Korean, Spanish, French, Russian, Italian, German, and Portuguese, with the ability to automatically detect the language.
In the supported languages the translator performs well, trying to replicate the original lettering as best it can. The translation is more than acceptable, although it feels a bit slower compared to Google Lens.
Both HiVision and Google Lens let you use the phone's camera to search for a product in online stores. The result is far from perfect in both cases, although you will have little trouble finding results by scanning everyday or relatively well-known products, as long as the camera can see them clearly.
The main difference here is not the recognition of the object, but the search for that object in stores. Google Shopping offers much more relevant results than those obtained by ViSenze, the Singapore-based image recognition company Huawei relies on.
Another feature HiVision and Google Lens share is general object recognition, regardless of what the object is. That is, you point your phone at an apple and the phone tells you (or should) that it is an apple. Here Huawei relies on Microsoft technology for the recognition, and the results are mediocre.
It is a more literal recognition based on other similar images, which may have little or nothing to do with what you actually want to analyze. Google Lens's search is by no means foolproof (it doesn't recognize Pikachu!), but it frequently understands more abstract concepts and adds extra information about the object.
A HiVision function not present in Google Lens is counting calories, this time powered by Azumio. The idea is this: you point at what you are about to eat and HiVision detects what it is and shows you its nutritional information. Something like Yuka, but more freeform, without relying on barcodes.
The system is curious, to say the least, and it is almost more entertaining to see what comes out than useful for the nutritional information it provides. Truth be told, it works better than I expected, although the most frustrating part is the interface itself: recognition comes and goes, and the information panel constantly appears and disappears.
In summary, neither Google Lens nor HiVision is perfect, although both are useful, or at least curious, in some situations. Google Lens is generally more stable and its text selection tool is very handy, while HiVision performs well as a camera translator and includes a calorie counter, which never hurts.