Active tasks requiring motion sensor data #1

Open
davwillev opened this issue Dec 2, 2021 · 4 comments
@davwillev

@MadsVSChristensen I have just had a quick look, and this package looks very exciting.

Continuing our discussion from Research Package, I noticed that none of the current activities/active tasks in CP seem to use motion sensor data (e.g. accelerometer, gyroscope, etc.). However, the tasks that I am interested in creating will need this. In my experience, once I can get hold of the sensor data, I can create the tasks relatively quickly.

From working on ResearchKit and ResearchStack previously, I know that there are big differences between how iOS and Android capture device motion. Is this something (e.g. a hook?) that will have to be built into RP or CP?

Thanks (and great work)!

@bardram
Contributor

bardram commented Jan 29, 2022

Hi @davwillev - collection of sensor data is supported in the CARP Mobile Sensing (CAMS) framework (which I am responsible for).

This framework is, however, designed for continuous, long-term sensing in the background.

What is your exact use case?

@davwillev
Author

davwillev commented Jan 31, 2022

Hi @bardram
There are several active tasks already implemented in RK that require sensor data (accelerometer, gyroscope, microphone, etc.) to augment self-reported data. I think the most high-profile use of this so far was a study that measured tremor and/or speech soundwaves to assess Parkinson's symptoms, which I believe was very successful.

My personal interest is to use motion data to assess aspects of body movements (range, smoothness, etc.). There are several 'walking' active tasks on RK which provide some useful data relating to gait. There are also some 'range of motion' tasks on RK (knee and shoulder), which I have contributed to, and I have since created versions of my own (back and neck).

It would be nice to reproduce all of these on RP. However, I think background sensing would also offer some excellent possibilities for research data.

@bardram bardram self-assigned this Mar 17, 2022
@bardram
Contributor

bardram commented Mar 18, 2022

Hi @davwillev

There are several active tasks already implemented in RK that require sensor data (accelerometer, gyroscope, microphone, etc.) to augment self-reported data. I think the most high-profile use of this so far was a study that measured tremor and/or speech soundwaves to assess Parkinson's symptoms, which I believe was very successful.

Thanks for the input. I am aware of the mPower study, but have not looked into the details or tried to replicate it in CAMS. It would be a good idea to see if this is doable, so we can check whether CAMS can support such a study.

I have tried to implement a simple example of how such a task might look in CAMS. Here is the Flutter code:

    // Trigger the app task periodically and collect motion sensor data
    // while the user completes the survey steps.
    protocol.addTriggeredTask(
        PeriodicTrigger(
          period: Duration(minutes: 2),
          duration: const Duration(seconds: 2),
        ),
        AppTask(
          type: SurveyUserTask.COGNITIVE_ASSESSMENT_TYPE,
          title: "Parkinson's Assessment",
          description: "A simple task assessing finger tapping speed.",
          minutesToComplete: 3,
        )
          // The survey itself: an instruction step followed by two tests.
          ..measures.add(RPTaskMeasure(
            type: SurveySamplingPackage.SURVEY,
            surveyTask:
                RPOrderedTask(identifier: "parkinsons_assessment", steps: [
              RPInstructionStep(
                  identifier: 'parkinsons_instruction',
                  title: "Parkinson's Disease Assessment",
                  text:
                      "In the following pages, you will be asked to solve two simple tests which will help assess your symptoms on a daily basis. "
                      "Each test has an instruction page, which you should read carefully before starting the test.\n\n"
                      "Please sit down comfortably and hold the phone in one hand while performing the test with the other."),
              RPFlankerActivity(
                'flanker_1',
                lengthOfTest: 30,
                numberOfCards: 10,
              ),
              RPTappingActivity(
                'tapping_1',
                lengthOfTest: 10,
              )
            ]),
          ))
          // Collect accelerometer and gyroscope data while the task runs.
          ..measures.add(Measure(type: SensorSamplingPackage.ACCELEROMETER))
          ..measures.add(Measure(type: SensorSamplingPackage.GYROSCOPE)),
        phone);

This is taken from the Pulmonary Monitor app, which allows us to set up a set of tasks for the user to do.

What this code basically does is:

  • create a user task for the user to do, which consists of three steps:
    1. an instruction
    2. a Flanker test
    3. a Finger Tapping test
  • collect accelerometer and gyroscope data while the user is performing these tests.

I hope this helps you understand how CAMS and the Cognition Package work together.

@davwillev
Author

davwillev commented Mar 24, 2022

Hi @bardram

This example makes it look fairly straightforward to collect sensor data, which is good to know.

To match RK's functionality and architecture, and to make future active tasks easy to add, we would probably have to reproduce the Device Motion Recorder, which provides a broad range of processed sensor data and can easily be called from within any active task requiring motion sensor data.
Documentation: http://researchkit.org/docs/Classes/ORKDeviceMotionRecorderConfiguration.html
Code: https://github.com/ResearchKit/ResearchKit/blob/main/ResearchKit/ActiveTasks/ORKDeviceMotionRecorder.h
https://github.com/ResearchKit/ResearchKit/blob/main/ResearchKit/ActiveTasks/ORKDeviceMotionRecorder.m

This is one of several subclasses of the ORKRecorder class within RK. These data recorders are crucial to the structure of the active tasks and how they behave. Importantly, all of the recorders automatically record the data in JSON (see the dependencies within ORKRecorder) and allow a data file to be saved (which is arguably the most useful output for upload and analysis).

The active task's StepViewController accesses the data during these recordings via a delegate method, which permits calculations to be made on the fly from the data (e.g. conversion from quaternions to degrees and then calculating min/max angles, etc.). See line 161 onwards here, for example.
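To make that on-the-fly calculation concrete, here is a minimal sketch of the quaternion-to-degrees step and the min/max angle tracking. The maths is language-agnostic; it is written in Python only for brevity, and the names `quaternion_to_degrees` and `AngleTracker` are mine, not part of RK, RP, or CAMS:

```python
import math

def quaternion_to_degrees(w, x, y, z):
    """Convert a unit quaternion to roll/pitch/yaw in degrees
    (Tait-Bryan angles, the usual aerospace convention)."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Clamp to [-1, 1] to avoid domain errors from floating-point noise.
    s = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(s)
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

class AngleTracker:
    """Track the min/max of one angle across a stream of motion samples,
    as a range-of-motion task would do in its delegate callback."""
    def __init__(self):
        self.min_deg = float("inf")
        self.max_deg = float("-inf")

    def update(self, deg):
        self.min_deg = min(self.min_deg, deg)
        self.max_deg = max(self.max_deg, deg)

    @property
    def range_deg(self):
        return self.max_deg - self.min_deg
```

Feeding each incoming attitude quaternion through `quaternion_to_degrees` and then into an `AngleTracker` yields the measured range of motion at the end of the recording.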

So, following this, I think the first required stage is to make the parent recorder class, and then we can try creating a device motion recorder subclass from this. After this, we can try to access this data stream from within the active task steps.
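As a rough sketch of that staged plan, the shape could look like this: a parent recorder that buffers samples, notifies a delegate for on-the-fly processing, and serializes the collected data to JSON, plus a device-motion subclass. All class and method names here are assumptions of mine (written in Python for brevity), not the actual RK or CAMS API:

```python
import json
from abc import ABC, abstractmethod

class Recorder(ABC):
    """Hypothetical parent recorder, mirroring the role ORKRecorder plays:
    buffer samples, notify a delegate, and serialize to JSON."""
    def __init__(self, delegate=None):
        self.samples = []
        self.delegate = delegate  # called with each new sample

    @abstractmethod
    def sample_type(self):
        """Identifier for the kind of data this recorder collects."""
        ...

    def record(self, sample):
        self.samples.append(sample)
        if self.delegate:
            # The delegate hook is where on-the-fly calculations
            # (e.g. angle tracking) would plug in.
            self.delegate(sample)

    def to_json(self):
        """Serialize the recording for upload and analysis."""
        return json.dumps({"type": self.sample_type(), "items": self.samples})

class DeviceMotionRecorder(Recorder):
    """Hypothetical subclass for fused accelerometer/gyroscope samples."""
    def sample_type(self):
        return "deviceMotion"
```

The point of the split is the one described above: the parent class owns buffering, delegation, and JSON output, so each new sensor-specific recorder only has to define what a sample is.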

Best wishes
