Jakob E. Bardram edited this page Feb 14, 2022 · 116 revisions

These are the wiki pages for the CACHET Research Platform (CARP) Mobile Sensing (CAMS) software. An overview of, and access to, the software libraries is available on the main GitHub site.

NOTE - CAMS has been released in a null-safe version 0.30.0, and not all parts of this wiki have been updated accordingly. The main difference is that CAMS now uses the carp_core domain model. This wiki will be updated ASAP.

CARP Mobile Sensing Framework in Flutter

The CARP Mobile Sensing (CAMS) Flutter package carp_mobile_sensing is a programming framework for adding digital phenotyping capabilities to your mobile (health) app. CAMS is designed to collect research-quality sensor data from the smartphone's on-board sensors and attached off-board wearable devices. These wiki pages contain overall documentation of the main software architecture and domain model of the framework, describe how to use and extend it, and include a set of appendices providing details on measure types, data formats, data backends, and sampling schemas.

Available Articles

  1. Software Architecture – the overall picture.
  2. Domain Model – the detailed picture of the data model(s).
  3. Using CARP Mobile Sensing – how to use the framework in your app.
  4. Extending CARP Mobile Sensing – how to extend the framework, with a focus on new sensing capabilities and external (wearable) devices.
  5. Best Practice – tips and tricks, especially related to differences between Android and iOS.

Appendices

Purpose and Goals

The overall goal of CAMS is to provide a programming framework that helps developers build custom mobile sensing applications.

The basic scenario we want to support is to allow programmers to design and implement a custom mHealth app, e.g., for cardiovascular diseases or diabetes. Such an app would have its main focus on providing application-specific functionality for patients with either heart rhythm problems or diabetes, which are two rather distinct application domains.

However, the goal of a mobile sensing programming framework is to enable programmers to add mobile sensing capabilities in a flexible and simple manner. This includes adding support for collecting data like ECG, location, activity, and step counts; formatting this data according to different health data formats (like the Open mHealth formats); using this data in the app (e.g., showing it to the user); and uploading it to a specific server, using a specific API (e.g., REST), in a specific format. The framework should be able to support different wearable devices for ECG or glucose monitoring. Hence, the focus is on software engineering support in terms of a sound programming API and runtime execution environment, which is maintained as the underlying mobile operating systems evolve. Moreover, the focus is on providing an extensible API and runtime environment, which allow for adding application-specific data sampling, wearable devices, data formatting, data management, and data uploading functionality.

Initial Example

The code listing below shows a simple Dart example of how to add sampling to a Flutter app. This basic example illustrates how sampling is configured, deployed, initialized, and used in four basic steps:

  1. a study protocol is defined;
  2. the protocol is deployed;
  3. the runtime environment is created, initialized, and started; and
  4. the stream of sampling events is consumed and used in the app.

import 'package:carp_core/carp_core.dart';
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';

/// This is an example of how to set up a study by using the `common`
/// sampling schema. Used in the README file.
void example() async {
  // create a study protocol
  StudyProtocol protocol = StudyProtocol(
    ownerId: 'AB',
    name: 'Track patient movement',
  );

  // define which devices are used for data collection
  // in this case, it's only this smartphone
  Smartphone phone = Smartphone();
  protocol.addMasterDevice(phone);

  // Add an automatic task that immediately starts collecting step counts,
  // ambient light, screen activity, and battery level - using the
  // SamplingPackageRegistry 'common' factory method.
  protocol.addTriggeredTask(
      ImmediateTrigger(),
      AutomaticTask()
        ..addMeasures(SamplingPackageRegistry().common.getMeasureList(
          types: [
            SensorSamplingPackage.PEDOMETER,
            SensorSamplingPackage.LIGHT,
            DeviceSamplingPackage.SCREEN,
            DeviceSamplingPackage.BATTERY,
          ],
        )),
      phone);

  // deploy this protocol using the on-phone deployment service
  StudyDeploymentStatus status =
      await SmartphoneDeploymentService().createStudyDeployment(protocol);

  String studyDeploymentId = status.studyDeploymentId;
  String deviceRolename = status.masterDeviceStatus!.device.roleName;

  // create and configure a client manager for this phone
  SmartPhoneClientManager client = SmartPhoneClientManager();
  await client.configure();

  // create a study runtime to control this deployment
  SmartphoneDeploymentController controller =
      await client.addStudy(studyDeploymentId, deviceRolename);

  // deploy
  await controller.tryDeployment();

  // configure the controller and resume sampling
  await controller.configure();
  controller.resume();

  // listen to and print all data events from the study
  controller.data.forEach(print);
}

In essence, the core logic of CAMS can be broken up into three independent parts:

1. Create and deploy a study protocol

This entails:

  1. create (or load) a StudyProtocol
  2. deploy the protocol using a DeploymentService, like the CAMS-specific SmartphoneDeploymentService
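Step 1 notes that a protocol can be loaded rather than built in code. The example above builds the protocol programmatically; the sketch below instead loads one from JSON. It is a hedged sketch only - the asset path `assets/protocol.json` is hypothetical, and the use of the carp_core `fromJson` serialization and Flutter's `rootBundle` are assumptions, not taken from this page:

```dart
import 'dart:convert';

import 'package:carp_core/carp_core.dart';
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';
import 'package:flutter/services.dart' show rootBundle;

/// Sketch: load a [StudyProtocol] from a bundled JSON asset and deploy it
/// using the on-phone deployment service.
/// The asset path 'assets/protocol.json' is a hypothetical example.
Future<StudyDeploymentStatus> loadAndDeployProtocol() async {
  // read the protocol definition bundled with the app
  String json = await rootBundle.loadString('assets/protocol.json');

  // deserialize it (assuming the carp_core JSON serialization)
  StudyProtocol protocol =
      StudyProtocol.fromJson(jsonDecode(json) as Map<String, dynamic>);

  // hand the protocol over to the on-phone deployment service
  return await SmartphoneDeploymentService().createStudyDeployment(protocol);
}
```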

2. Load a study deployment for the smartphone and execute the sensing

This entails:

  1. create and configure a ClientManager, like the CAMS-specific SmartPhoneClientManager
  2. create a SmartphoneDeploymentController for executing the deployment
  3. if needed, load and register external SamplingPackages
  4. configure and resume the SmartphoneDeploymentController
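Step 3 - registering external sampling packages - is not shown in the initial example above. As a minimal sketch, assuming a context sampling package has been added to pubspec.yaml (the package name, import path, and ContextSamplingPackage class are assumptions), registration could look like this:

```dart
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';
// the package name and import path below are assumptions
import 'package:carp_context_package/context.dart';

/// Sketch: register an external sampling package so that its measure
/// types become available to the study runtime. This should be done
/// before the client manager is configured.
void registerExternalPackages() {
  SamplingPackageRegistry().register(ContextSamplingPackage());
}
```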

3. Use the data in the app

If the generated data is to be used in the app (e.g., shown in the user interface), listen to the data stream exposed by the controller.
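As a hedged sketch of showing collected data in the interface, the controller's data stream can be fed to a Flutter StreamBuilder. This assumes the stream emits DataPoint objects and that the controller has already been configured and resumed, as in the example above:

```dart
import 'package:flutter/material.dart';
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';

/// Sketch: a widget showing the most recently collected data point.
/// Assumes [controller] is a configured and resumed
/// SmartphoneDeploymentController whose data stream emits [DataPoint]s.
class LastDataPointView extends StatelessWidget {
  final SmartphoneDeploymentController controller;
  const LastDataPointView(this.controller, {Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) => StreamBuilder<DataPoint>(
        stream: controller.data,
        builder: (context, snapshot) => snapshot.hasData
            // show the latest data point as plain text
            ? Text(snapshot.data.toString())
            : const Text('No data collected yet'),
      );
}
```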


Section 3 on Using CARP Mobile Sensing contains more details on this flow. But before digging into this, read Section 2 on the Domain Model. Section 1 provides some technical details on the CAMS architecture, but it should not be necessary to read it before using CAMS. Section 4 provides details on how to extend CAMS.
