Service Scenario Description

MediScale monitors a user's health across various parameters to help maintain optimal respiratory and cardiac health. The fall detection system monitors the user's movement and triggers an alarm that informs all saved contacts in case of a fall injury. In addition, Map Kit and Site Kit are used to find and mark the nearest hospitals during an emergency, and ML Kit (Image Classification) is used to extract nutritional information when food is scanned with the camera.

Features | HMS Kits
Displaying the nearest hospitals | Map Kit
Finding hospitals on the map | Site Kit
Recognizing user activity | Location Kit
Extracting nutritional information by scanning food objects | ML Kit (Image Classification)
Obtaining the current location | Location Kit

What You Will Create

In this code lab, you will create a demo project and use the APIs of Location Kit, Map Kit, Site Kit, and ML Kit. You can explore the following processes in the demo project:

What You Will Learn

In this code lab, you will learn how to:

Hardware Requirements

Software Requirements

Reference:
Integration preparations
Map Kit integration
Site Kit integration
Location Kit integration
ML Kit integration

  1. Go to Project Settings > Manage APIs and enable the API permission for the following kits: Map Kit, Site Kit, Location Kit, and ML Kit.

The My Dashboard screen allows the user to select options such as calculating BMI to maintain optimal health, checking the number of COVID-19 cases in the area, recording daily steps, adding default threshold data for fall detection, and calibrating the sensors for the fall detection service.

It also allows the user to set their primary and secondary contacts, who are alerted when the app detects a fall.

The fall detection screen is the core part of the app: it monitors the user's movement and triggers a warning message when a fall is detected. The graph allows users to keep track of their movements.

The body mass index (BMI) screen helps the user monitor and track their overall health. Weight and height are the inputs, and the values can be entered in metric or imperial units.
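
For reference, the underlying computation is weight divided by height squared (kg/m²), with a conversion factor for imperial units. A minimal Java sketch; the class and method names are illustrative, not from the project source:

public final class BmiCalculator {

    private BmiCalculator() {
    }

    // Metric: weight in kilograms, height in meters.
    public static double fromMetric(double weightKg, double heightM) {
        return weightKg / (heightM * heightM);
    }

    // Imperial: weight in pounds, height in inches (the factor 703 converts to kg/m^2).
    public static double fromImperial(double weightLb, double heightIn) {
        return 703 * weightLb / (heightIn * heightIn);
    }
}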

The AQI and weather page lets the user check the environment before going outside for any form of exercise. AQI values play an important role for people with respiratory problems.

Class Diagram

Map Kit Integration

Huawei Map Kit provides powerful and convenient map services for implementing personalized map display and interaction with ease. The Map SDK for Android is a set of APIs for developing maps. You can use this SDK to easily add map-related functions to your Android app, including map display, map interaction, map drawing, and map style customization.
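
The marker code in Step 2 below assumes an initialized HuaweiMap instance (hMap). A minimal Java sketch of obtaining one, assuming a MapView with the ID mapview in the activity layout; the field names are illustrative:

// Sketch: obtain the HuaweiMap instance from a MapView in the layout.
mMapView = findViewById(R.id.mapview);
mMapView.onCreate(savedInstanceState);
mMapView.getMapAsync(new OnMapReadyCallback() {
    @Override
    public void onMapReady(HuaweiMap huaweiMap) {
        hMap = huaweiMap;
        hMap.setMyLocationEnabled(true); // requires the location permission
    }
});
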
Step 1: In the Nearest Hospital activity started by the app, call the initViews method to initialize the location services.
Java

fusedLocationProviderClient = LocationServices.getFusedLocationProviderClient(this);
settingsClient = LocationServices.getSettingsClient(this);
mLocationRequest = new LocationRequest();
mLocationRequest.setInterval(10000);
mLocationRequest.setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY);
mLocationCallback = new LocationCallback() {
    @Override
    public void onLocationResult(LocationResult locationResult) {
        if (locationResult != null) {
            List<Location> locations = locationResult.getLocations();
            if (!locations.isEmpty()) {
                for (Location location : locations) {
                    mCurrentLocation.setLatitude(location.getLatitude());
                    mCurrentLocation.setLongitude(location.getLongitude());
                    if (!isCalledHospitalApi && location.getAccuracy() < 35) {
                        isCalledHospitalApi = true;
                        callNearByHospitalApi(mRadius);
                    }
                    Log.i(TAG, "onLocationResult location[Longitude,Latitude,Accuracy]:"
                            + location.getLongitude() + "," + location.getLatitude() + ","
                            + location.getAccuracy());
                }
            }
        }
    }

    @Override
    public void onLocationAvailability(LocationAvailability locationAvailability) {
        if (locationAvailability != null) {
            boolean flag = locationAvailability.isLocationAvailable();
            Log.i(TAG, "onLocationAvailability isLocationAvailable:" + flag);
        }
    }
};

Kotlin

fusedLocationProviderClient = LocationServices.getFusedLocationProviderClient(this)
settingsClient = LocationServices.getSettingsClient(this)
mLocationRequest = LocationRequest()
mLocationRequest!!.interval = Constants.INTERVAL_10000.toLong()
mLocationRequest!!.priority = LocationRequest.PRIORITY_HIGH_ACCURACY
mLocationCallback = object : LocationCallback() {
    override fun onLocationResult(locationResult: LocationResult) {
        val locations = locationResult.locations
        if (locations.isNotEmpty()) {
            for (location in locations) {
                mCurrentLocation.latitude = location.latitude
                mCurrentLocation.longitude = location.longitude
                if (!isCalledHospitalApi && location.accuracy < Constants.MAX_ACCURACY) {
                    isCalledHospitalApi = true
                    callNearByHospitalApi(mRadius)
                }
                Log.d(TAG, getString(R.string.on_location_result) + location.longitude
                        + Constants.STR_COMMA + location.latitude
                        + Constants.STR_COMMA + location.accuracy)
            }
        }
    }

    override fun onLocationAvailability(locationAvailability: LocationAvailability) {
        val flag = locationAvailability.isLocationAvailable
        Log.d(TAG, getString(R.string.on_location_availability_is_location_available) + flag)
    }
}

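The excerpt above only builds the request and callback; the sample still has to start the updates. A minimal Java sketch of the usual fused-location call, using the fields initialized in initViews:

// Sketch: start receiving location updates on the main looper.
fusedLocationProviderClient
        .requestLocationUpdates(mLocationRequest, mLocationCallback, Looper.getMainLooper())
        .addOnSuccessListener(new OnSuccessListener<Void>() {
            @Override
            public void onSuccess(Void aVoid) {
                Log.i(TAG, "requestLocationUpdates onSuccess");
            }
        })
        .addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(Exception e) {
                Log.e(TAG, "requestLocationUpdates onFailure: " + e.getMessage());
            }
        });
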
Step 2: Call the callNearByHospitalApi method to obtain the list of nearby hospitals, and call the addHospitalMarkerToMap method to add markers for them to the Huawei map.

Java

private void callNearByHospitalApi(int mRadius) {
    NearbySearchRequest mHospitalRequest = new NearbySearchRequest();
    mHospitalRequest.setLocation(new Coordinate(mCurrentLocation.getLatitude(), mCurrentLocation.getLongitude()));
    mHospitalRequest.setRadius(mRadius * Constants.INIT_1000);
    mHospitalRequest.setPoiType(LocationType.HOSPITAL);
    mHospitalRequest.setLanguage(String.valueOf(Constants.LANGUAGE_EN));
    SearchResultListener<NearbySearchResponse> mListener = new SearchResultListener<NearbySearchResponse>() {
        @Override
        public void onSearchResult(NearbySearchResponse nearbySearchResponse) {
            mSites = (ArrayList<Site>) nearbySearchResponse.getSites();
            if (mSites != null) {
                for (int i = INIT_ZERO; i < mSites.size(); i++) {
                    addHospitalMarkerToMap(mSites.get(i));
                }
            } else {
                Log.d(TAG, getString(R.string.no_near_hospital));
            }
        }

        @Override
        public void onSearchError(SearchStatus searchStatus) {
            // No-op: search errors are ignored in this sample.
        }
    };
    mSearchService.nearbySearch(mHospitalRequest, mListener);
}

Kotlin

private fun callNearByHospitalApi(mRadius: Int) {
    val mHospitalRequest = NearbySearchRequest()
    mHospitalRequest.location = Coordinate(mCurrentLocation.latitude, mCurrentLocation.longitude)
    mHospitalRequest.radius = mRadius * Constants.INIT_1000
    mHospitalRequest.poiType = LocationType.HOSPITAL
    mHospitalRequest.language = Constants.LANGUAGE_EN
    val mListener: SearchResultListener<NearbySearchResponse> = object : SearchResultListener<NearbySearchResponse> {
        override fun onSearchResult(nearbySearchResponse: NearbySearchResponse) {
            mSites = nearbySearchResponse.sites as ArrayList<Site>
            if (mSites != null) {
                for (i in Constants.INIT_ZERO until mSites!!.size) {
                    addHospitalMarkerToMap(mSites!![i])
                }
            } else {
                Log.d(TAG, getString(R.string.no_near_hospital))
            }
        }

        override fun onSearchError(searchStatus: SearchStatus) {
            // No-op: search errors are ignored in this sample.
        }
    }
    mSearchService!!.nearbySearch(mHospitalRequest, mListener)
}

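The addHospitalMarkerToMap method called above is not shown in the excerpt. A minimal Java sketch of one possible implementation, assuming hMap holds the HuaweiMap instance obtained earlier:

// Sketch: add one hospital site as a marker on the Huawei map.
private void addHospitalMarkerToMap(Site site) {
    if (site == null || site.getLocation() == null) {
        return;
    }
    LatLng position = new LatLng(site.getLocation().getLat(), site.getLocation().getLng());
    hMap.addMarker(new MarkerOptions()
            .position(position)
            .title(site.getName())
            .snippet(site.getFormatAddress()));
}
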
The map shows the list of nearest hospitals.

ML Kit Integration

Step 1: Obtain the MLApplication instance and set the API key in the onCreate method of FoodDetectionActivity.

Java

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_food_detection);
    mPreviewView = findViewById(R.id.camera);
    captureImage = findViewById(R.id.btncapture);
    mDetectedFood = findViewById(R.id.tv_detectedFood);
    alertDialog = new Dialog(this);
    MLApplication.getInstance().setApiKey("API_KEY");
    if (allPermissionsGranted()) {
        startCamera(); // Start the camera if permission has been granted by the user.
    } else {
        ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS);
    }
}

Kotlin

private var alertDialog: Dialog? = null

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_food_detection)
    mPreviewView = findViewById(R.id.camera)
    captureImage = findViewById(R.id.btncapture)
    mDetectedFood = findViewById(R.id.tv_detectedFood)
    alertDialog = Dialog(this)
    // getString resolves the resource value; calling toString() on the ID would set the ID itself.
    MLApplication.getInstance().apiKey = getString(R.string.API_KEY)
    if (allPermissionsGranted()) {
        startCamera() // Start the camera if permission has been granted by the user.
    } else {
        ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS)
    }
}

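The allPermissionsGranted helper used in onCreate is a standard runtime-permission check. A minimal Java sketch, assuming REQUIRED_PERMISSIONS is the permission array referenced above:

// Sketch: return true only if every required permission has already been granted.
private boolean allPermissionsGranted() {
    for (String permission : REQUIRED_PERMISSIONS) {
        if (ContextCompat.checkSelfPermission(this, permission)
                != PackageManager.PERMISSION_GRANTED) {
            return false;
        }
    }
    return true;
}
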
Step 2: Start the camera to capture an image and save it to a folder. The saved image is then used to detect the food and its nutritional details.

Java

private void startCamera() {
    final ListenableFuture<ProcessCameraProvider> cameraProviderFuture =
            ProcessCameraProvider.getInstance(this);
    cameraProviderFuture.addListener(new Runnable() {
        @Override
        public void run() {
            try {
                ProcessCameraProvider cameraProvider = cameraProviderFuture.get();
                bindPreview(cameraProvider);
            } catch (ExecutionException | InterruptedException e) {
                e.printStackTrace();
            }
        }
    }, ContextCompat.getMainExecutor(this));
}

Kotlin

private fun startCamera() {
    val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
    cameraProviderFuture.addListener(Runnable {
        try {
            val cameraProvider = cameraProviderFuture.get()
            bindPreview(cameraProvider)
        } catch (e: ExecutionException) {
            e.printStackTrace()
        } catch (e: InterruptedException) {
            e.printStackTrace()
        }
    }, ContextCompat.getMainExecutor(this))
}

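The bindPreview method called from startCamera binds the camera use cases to the activity lifecycle. A minimal CameraX sketch in Java; the imageCapture field is an assumption based on the capture step described next:

// Sketch: bind preview and image capture to the lifecycle (CameraX).
private void bindPreview(ProcessCameraProvider cameraProvider) {
    Preview preview = new Preview.Builder().build();
    CameraSelector cameraSelector = new CameraSelector.Builder()
            .requireLensFacing(CameraSelector.LENS_FACING_BACK)
            .build();
    preview.setSurfaceProvider(mPreviewView.getSurfaceProvider());
    imageCapture = new ImageCapture.Builder().build();
    cameraProvider.unbindAll();
    cameraProvider.bindToLifecycle(this, cameraSelector, preview, imageCapture);
}
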
Step 3: After saving the image to the folder, obtain the image path, create an MLFrame from it, and analyze it with an analyzer configured through MLObjectAnalyzerSetting; when food is detected, image classification returns the classification names.

Java

Bitmap myBitmap = BitmapFactory.decodeFile(file.getAbsolutePath());
MLObjectAnalyzerSetting setting = new MLObjectAnalyzerSetting.Factory()
        .setAnalyzerType(MLObjectAnalyzerSetting.TYPE_PICTURE)
        .allowMultiResults()
        .allowClassification()
        .create();
objectAnalyzer = MLAnalyzerFactory.getInstance().getLocalObjectAnalyzer(setting);
final MLFrame frame = MLFrame.fromBitmap(myBitmap);
Task<List<MLObject>> task = objectAnalyzer.asyncAnalyseFrame(frame);
// Asynchronously process the result returned by the object detector.
task.addOnSuccessListener(new OnSuccessListener<List<MLObject>>() {
    @Override
    public void onSuccess(List<MLObject> objects) {
        SparseArray<MLObject> objectSparseArray = objectAnalyzer.analyseFrame(frame);
        for (int i = 0; i < objectSparseArray.size(); i++) {
            if (objectSparseArray.valueAt(i).getTypeIdentity() == MLObject.TYPE_FOOD) {
                // The object is food: run cloud-based image classification on it.
                MLRemoteClassificationAnalyzerSetting cloudSetting =
                        new MLRemoteClassificationAnalyzerSetting.Factory()
                                .setMinAcceptablePossibility(0.8f)
                                .create();
                cloudImageClassificationAnalyzer =
                        MLAnalyzerFactory.getInstance().getRemoteImageClassificationAnalyzer(cloudSetting);
                MLFrame cloudFrame = MLFrame.fromBitmap(myBitmap);
                Task<List<MLImageClassification>> cloudTask =
                        cloudImageClassificationAnalyzer.asyncAnalyseFrame(cloudFrame);
                cloudTask.addOnSuccessListener(new OnSuccessListener<List<MLImageClassification>>() {
                    @Override
                    public void onSuccess(List<MLImageClassification> classifications) {
                        dismissLoadingDialog();
                        ArrayList<String> result = new ArrayList<>();
                        for (MLImageClassification classification : classifications) {
                            result.add(classification.getName());
                        }
                        Log.d(TAG, "onSuccess: " + result);
                        StringBuilder detectedFood = new StringBuilder();
                        for (String details : result) {
                            detectedFood.append(details).append(",");
                        }
                        mDetectedFood.setText(detectedFood.toString());
                    }
                }).addOnFailureListener(new OnFailureListener() {
                    @Override
                    public void onFailure(Exception e) {
                        try {
                            MLException mlException = (MLException) e;
                            int errorCode = mlException.getErrCode();
                            String errorMessage = mlException.getMessage();
                            Toast.makeText(FoodDetectionActivity.this, errorMessage, Toast.LENGTH_SHORT).show();
                        } catch (Exception error) {
                            // Handle the conversion error.
                            Toast.makeText(FoodDetectionActivity.this, error.getMessage(), Toast.LENGTH_SHORT).show();
                        }
                    }
                });
            } else {
                dismissLoadingDialog();
                Toast.makeText(FoodDetectionActivity.this, "NOT FOOD", Toast.LENGTH_SHORT).show();
            }
        }
    }
}).addOnFailureListener(new OnFailureListener() {
    @Override
    public void onFailure(Exception e) {
        // Detection failure.
        dismissLoadingDialog();
        Toast.makeText(FoodDetectionActivity.this, "Detection Failed", Toast.LENGTH_SHORT).show();
    }
});

Kotlin

var myBitmap: Bitmap? = null
try {
    myBitmap = BitmapFactory.decodeFile(finalFile.canonicalPath)
} catch (e: IOException) {
    ExceptionHandling.PrintExceptionInfo(getString(R.string.exception_str_io), e)
}
val setting = MLObjectAnalyzerSetting.Factory()
    .setAnalyzerType(MLObjectAnalyzerSetting.TYPE_PICTURE)
    .allowMultiResults()
    .allowClassification()
    .create()
objectAnalyzer = MLAnalyzerFactory.getInstance().getLocalObjectAnalyzer(setting)
val frame = MLFrame.fromBitmap(myBitmap)
val task = objectAnalyzer?.asyncAnalyseFrame(frame)
// Asynchronously process the result returned by the object detector.
val finalMyBitmap = myBitmap
task?.addOnSuccessListener {
    val objectSparseArray = objectAnalyzer?.analyseFrame(frame)
    for (i in Constants.INIT_ZERO until objectSparseArray?.size()!!) {
        if (objectSparseArray?.valueAt(i)?.typeIdentity == MLObject.TYPE_FOOD) {
            // The object is food: run cloud-based image classification on it.
            val cloudSetting = MLRemoteClassificationAnalyzerSetting.Factory()
                .setMinAcceptablePossibility(Constants.FLOAT_8F)
                .create()
            cloudImageClassificationAnalyzer =
                MLAnalyzerFactory.getInstance().getRemoteImageClassificationAnalyzer(cloudSetting)
            val cloudFrame = MLFrame.fromBitmap(finalMyBitmap)
            val cloudTask = cloudImageClassificationAnalyzer?.asyncAnalyseFrame(cloudFrame)
            cloudTask?.addOnSuccessListener { classifications ->
                dismissLoadingDialog()
                val result = ArrayList<String>()
                for (classification in classifications) {
                    result.add(classification.name)
                }
                val detectedFood = StringBuilder()
                for (details in result) {
                    detectedFood.append(details).append(Constants.STR_COMMA)
                }
                mDetectedFood!!.text = detectedFood.toString()
            }?.addOnFailureListener { e ->
                try {
                    val mlException = e as MLException
                    val errorCode = mlException.errCode
                    val errorMessage = mlException.message
                    Toast.makeText(this@FoodDetectionActivity, errorMessage, Toast.LENGTH_SHORT).show()
                } catch (error: Exception) {
                    // Handle the conversion error.
                    Toast.makeText(this@FoodDetectionActivity, error.message, Toast.LENGTH_SHORT).show()
                }
            }
        } else {
            dismissLoadingDialog()
            Toast.makeText(this@FoodDetectionActivity, getString(R.string.not_food), Toast.LENGTH_SHORT).show()
        }
    }
}?.addOnFailureListener {
    dismissLoadingDialog()
    // Detection failure.
    Toast.makeText(this@FoodDetectionActivity, R.string.detection_failed, Toast.LENGTH_SHORT).show()
}

Step 4: Implement the fall detection logic based on the threshold boundary set by the user. The logic computes the squared change in acceleration between consecutive sensor events and triggers the SOS flow when the change exceeds the threshold.

Java

// Cache the previous reading, then refresh "gravity" from the sensor event.
// (In the full source these arrays persist across sensor events.)
double[] gravity_updated_values = new double[3];
double[] gravity = new double[3];
gravity_updated_values[ARRAY_INDEX_FIRST] = gravity[ARRAY_INDEX_FIRST];
gravity_updated_values[ARRAY_INDEX_SECOND] = gravity[ARRAY_INDEX_SECOND];
gravity_updated_values[ARRAY_INDEX_THIRD] = gravity[ARRAY_INDEX_THIRD];
gravity[ARRAY_INDEX_FIRST] = event.values[ARRAY_INDEX_FIRST];
gravity[ARRAY_INDEX_SECOND] = event.values[ARRAY_INDEX_SECOND];
gravity[ARRAY_INDEX_THIRD] = event.values[ARRAY_INDEX_THIRD];
// Squared Euclidean distance between the previous and current readings.
double updatedAmount =
        Math.pow(gravity[ARRAY_INDEX_FIRST] - gravity_updated_values[ARRAY_INDEX_FIRST], Constants.INIT_2)
        + Math.pow(gravity[ARRAY_INDEX_SECOND] - gravity_updated_values[ARRAY_INDEX_SECOND], Constants.INIT_2)
        + Math.pow(gravity[ARRAY_INDEX_THIRD] - gravity_updated_values[ARRAY_INDEX_THIRD], Constants.INIT_2);
// A change larger than the user-defined threshold is treated as a fall.
if (!firstChange && updatedAmount >= thresholdBoundary) {
    startActivity(new Intent(this, SOSActivity.class));
    finish();
}

Kotlin

// Cache the previous reading, then refresh "gravity" from the sensor event.
// (In the full source these arrays persist across sensor events.)
val gravity_updated_values = DoubleArray(3)
val gravity = DoubleArray(3)
gravity_updated_values[ARRAY_INDEX_FIRST] = gravity[ARRAY_INDEX_FIRST]
gravity_updated_values[ARRAY_INDEX_SECOND] = gravity[ARRAY_INDEX_SECOND]
gravity_updated_values[ARRAY_INDEX_THIRD] = gravity[ARRAY_INDEX_THIRD]
gravity[ARRAY_INDEX_FIRST] = event.values[ARRAY_INDEX_FIRST].toDouble()
gravity[ARRAY_INDEX_SECOND] = event.values[ARRAY_INDEX_SECOND].toDouble()
gravity[ARRAY_INDEX_THIRD] = event.values[ARRAY_INDEX_THIRD].toDouble()
// Squared Euclidean distance between the previous and current readings.
val updatedAmount =
    Math.pow(gravity[ARRAY_INDEX_FIRST] - gravity_updated_values[ARRAY_INDEX_FIRST], Constants.INIT_2.toDouble()) +
    Math.pow(gravity[ARRAY_INDEX_SECOND] - gravity_updated_values[ARRAY_INDEX_SECOND], Constants.INIT_2.toDouble()) +
    Math.pow(gravity[ARRAY_INDEX_THIRD] - gravity_updated_values[ARRAY_INDEX_THIRD], Constants.INIT_2.toDouble())
// A change larger than the user-defined threshold is treated as a fall.
if (!firstChange && updatedAmount >= thresholdBoundary) {
    startActivity(Intent(this, SOSActivity::class.java))
    finish()
}

In the case of a false detection, the user can stop the trigger so that no messages are sent to the primary and secondary contacts. The SOS screen gives the user 15 seconds to respond before any form of alarm is raised.
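
One straightforward way to implement such a 15-second response window is Android's CountDownTimer. The Java sketch below is illustrative: mCountdownView and sendAlertToContacts are hypothetical names standing in for the app's actual view and alert logic.

// Sketch: give the user 15 seconds to cancel before alerting the saved contacts.
private CountDownTimer sosTimer;

private void startSosCountdown() {
    sosTimer = new CountDownTimer(15000, 1000) {
        @Override
        public void onTick(long millisUntilFinished) {
            // Update the on-screen countdown (hypothetical view).
            mCountdownView.setText(String.valueOf(millisUntilFinished / 1000));
        }

        @Override
        public void onFinish() {
            sendAlertToContacts(); // hypothetical helper for the app's alert logic
        }
    };
    sosTimer.start();
}

private void cancelSos() {
    if (sosTimer != null) {
        sosTimer.cancel(); // the user confirmed a false alarm
    }
}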

Well done. You have successfully built the MediScale Health app and learned how to:

  1. Display maps and mark locations using Map Kit.
  2. Search for nearby places using Site Kit.
  3. Classify images using ML Kit.
  4. Obtain the current location and recognize user activity using Location Kit.

You can also click the button below to download the source code.
Download source code

Disclaimer: This codelab is a reference for implementing a combination of multiple HMS kits in a single project. Developers should verify and ensure the legal and security compliance of the relevant open-source code.
