Overview

HUAWEI ML Kit provides a product visual search service. Given a product photo taken by a user, the service searches a pre-established product image library for the same or similar products and returns their IDs and related information, which you can use in your apps.

What You Will Create

You will create a product visual search app.

What You Will Learn

Hardware Requirements

Software Requirements

Download the following demo project package:

Download source code

Decompress the downloaded package to a local directory, then use Xcode to open it.

  1. Click the agconnect-services.plist file in the directory of the demo project.
  2. Go to Root > service > ml and set mlservice_url to the URL of the environment used for detection. For details, refer to the README.
  3. Go to Root > client and set api_key. (See the plist sketch after this list for where these two values live.)
  4. Ensure your phone is correctly connected to the computer.
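For orientation, a minimal sketch of the relevant agconnect-services.plist entries is shown below. The values are placeholders, and the surrounding dictionary structure is assumed; it may differ between SDK versions:

<key>service</key>
<dict>
    <key>ml</key>
    <dict>
        <key>mlservice_url</key>
        <string>URL-of-the-detection-environment</string>
    </dict>
</dict>
<key>client</key>
<dict>
    <key>api_key</key>
    <string>your-api-key</string>
</dict>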

A certificate is required for debugging on a real device. Therefore, ensure that the certificate matches the provisioning profile before debugging.

Click the demo project name under TARGETS, go to Signing & Capabilities > Debug, expand Signing, and select Automatically manage signing.

If you have a debug certificate and provisioning profile (corresponding to an enterprise account or a personal account), set Team and Bundle Identifier. If the certificate, provisioning profile, and bundle identifier do not match, an error will be displayed and you cannot debug on a real device. The following figure shows an incorrect example.

Does this mean you cannot debug without a certificate and provisioning profile? No.

Apple has provided for this case: as long as you have an Apple ID, you can debug for free.

Go to Xcode > Preferences, click the Accounts tab, click ➕, choose Apple ID, and enter your account and password as prompted. Then you can configure the certificate under Signing.

Set Bundle Identifier (which must be unique) and Team. Xcode will then automatically generate a certificate and provisioning profile for you.

The following figure shows the errors you may encounter.

The likely cause is that the bundle identifier you entered is already in use. In this case, enter a different bundle identifier.

Note that Apple limits the number of times a free account can be used for debugging. If you reach the limit, you can switch to another Apple ID.

After the configuration, you can press Command + B to compile the demo project. If no error is reported, the compilation is successful.

Skip this step if the permissions have been added.
In the demo project directory, right-click the info.plist file and choose Open As > Source Code. In the editor that opens, add the following code:

<key>NSCameraUsageDescription</key>
<string>App needs your consent to access the camera</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>App needs your consent to access the album</string>

This adds the camera and album permissions the app needs.
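These usage descriptions are the strings iOS shows in its permission prompts. If you want to check the camera authorization state in code before presenting the camera (a common pattern, not something the demo requires), a minimal sketch:

#import <AVFoundation/AVFoundation.h>

// Query the current camera permission; iOS shows the
// NSCameraUsageDescription prompt on the first request.
AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (status == AVAuthorizationStatusNotDetermined) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        // 'granted' indicates whether the user allowed camera access.
    }];
}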

Set detection parameters in the requestImage method in the MLDocumentViewController.m file.

Sample code:

// Configure the product visual search analyzer.
MLRemoteProductVisionSearchAnalyzerSetting *setting = [[MLRemoteProductVisionSearchAnalyzerSetting alloc] init];
// Maximum number of matched products to return.
setting.maxResult = 20;
//setting.productSetId = @"**********";
[MLRemoteProductVisionSearchAnalyzer setRemoteProductVisionSearchAnalyzerSetting:setting];
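Here, maxResult caps the number of matched products the service returns. The commented-out productSetId appears to select a specific product image library; you would set it if you maintain more than one product set (the actual value is project-specific, hence the placeholder).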

Call the detection API and set the display of results in the requestImage method in the MLDocumentViewController.m file.

Sample code:

[MLRemoteProductVisionSearchAnalyzer asyncAnalyseImage:self.selectedImage addOnSuccessListener:^(MLProductVisionSearch * _Nonnull productModel) {
    // Display the result after successful detection.
    NSTimeInterval codeTime = -[startTime timeIntervalSinceNow];
    self->navView.timeShowLabel.text = [NSString stringWithFormat:@"%.4f S", codeTime];
    self->list.selectedImage = self.selectedImage;
    self->list.dataArr = productModel;
    if (productModel.productList && productModel.productList.count > 0) {
        // Draw the bounding box of the detected product.
        MLProductSearchView *searchView = [[MLProductSearchView alloc] initWithFrame:self.fullImageview.frame
                                                                                 box:@[@(productModel.border.left_top_x),
                                                                                       @(productModel.border.left_top_y),
                                                                                       @(productModel.border.right_bottom_x),
                                                                                       @(productModel.border.right_bottom_y)]];
        searchView.imageSize = [self calculateTextScale:self.selectedImage];
        [self.view addSubview:searchView];
        searchView.backgroundColor = [UIColor clearColor];
    }
    // Dim the background and dismiss the result list on tap.
    UIView *maskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, kMLDocumentDeviceWidth, kMLDocumentDeviceHeight)];
    maskView.backgroundColor = [UIColor colorWithWhite:0.0 alpha:0.3];
    maskView.userInteractionEnabled = YES;
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapClick:)];
    [maskView addGestureRecognizer:tapGesture];
    [self.view addSubview:maskView];
    [self.view addSubview:self->list];
    [self addImageView];
} addOnFailureListener:^(NSInteger errCode, NSString * _Nonnull errMsg) {
    // Display the elapsed time after the detection fails.
    NSTimeInterval codeTime = -[startTime timeIntervalSinceNow];
    self->navView.timeShowLabel.text = [NSString stringWithFormat:@"%.4f S", codeTime];
}];
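On success, productModel.productList holds the matched products, which (per the service overview) carry product IDs and related information; productModel.border is the bounding box of the detected product in the image, which the demo uses to position the MLProductSearchView highlight.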

Running the App

Click the Run button on the toolbar of Xcode to install and run the demo on your device.

Try Now

Take a photo or select an image from the album for detection.
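The demo already wires up the camera and album pickers. For reference, a minimal sketch of obtaining an image with UIImagePickerController (the demo's actual picker code may differ):

#import <UIKit/UIKit.h>

// Present the photo album; use UIImagePickerControllerSourceTypeCamera
// to take a photo instead. self must conform to
// UIImagePickerControllerDelegate and UINavigationControllerDelegate.
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
picker.delegate = self;
[self presentViewController:picker animated:YES completion:nil];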

Well done. You have successfully completed this codelab and learned how to use the product visual search service of HUAWEI ML Kit in an iOS app.

For more information about HUAWEI ML Kit, please visit our official website https://developer.huawei.com/consumer/en/hms/huawei-mlkit.

This project is only for demonstration, and the actual development process should strictly follow the development guide. For details, please refer to https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/service-introduction-0000001050040017.
