
Ability to capture / store other interactions / communications beyond the patient agent #12

Open
JRPearson500 opened this issue Dec 30, 2021 · 1 comment
Labels
enhancement New feature or request

Comments

@JRPearson500
Member

As well as generating a patient health record update, the Intelligence Layer could produce data to accompany a particular
interaction - such as the image of a scan, or a discharge letter for a patient.

Currently we have not modelled an interaction function that can produce such data; however, each environment has an attribute
patient_data that could be used to store this information.

patient_data is a Python defaultdict(list), so that the keys can be set to the patient_id and the values are lists
containing the data. We envisaged it could be used as follows:

  • Patient and environment interaction using an interaction function - the interaction generates some data, for example, a scan
  • The data could look like a dictionary:
    {
        "real_time": "2021-03-24 15:55:25",
        "patient_time": "2021-03-24 15:55:25",
        "environment_database_name": "PACS",
        "visible_to_environment_ids": [1, 3, 15],
        "patient_record_indices": [22, 23, 24],
        "interaction_name": "write_letter",
        "content_type": "image",
        "content": <scan.png>,
    }
  • This entry would then be appended to the patient_data[patient_id] list
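The append pattern above could be sketched as follows. This is a hypothetical illustration, not the actual project code; the patient_id value and the helper function name are invented for the example.

```python
from collections import defaultdict
from datetime import datetime

# patient_data as described: a defaultdict(list) keyed by patient_id.
patient_data = defaultdict(list)

def record_interaction_data(patient_id, entry):
    """Append an interaction's accompanying data to the patient's list."""
    patient_data[patient_id].append(entry)

# Example entry mirroring the dictionary shown above.
record_interaction_data(
    42,  # hypothetical patient_id
    {
        "real_time": datetime(2021, 3, 24, 15, 55, 25),
        "patient_time": datetime(2021, 3, 24, 15, 55, 25),
        "environment_database_name": "PACS",
        "visible_to_environment_ids": [1, 3, 15],
        "patient_record_indices": [22, 23, 24],
        "interaction_name": "write_letter",
        "content_type": "image",
        "content": "scan.png",  # placeholder reference, not an actual image
    },
)
```

Because a defaultdict(list) creates an empty list on first access, no existence check is needed before appending a patient's first entry.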
@JRPearson500 JRPearson500 added the enhancement New feature or request label Dec 30, 2021

JRPearson500 commented Dec 30, 2021

In the Alpha data model, since only one patient and one environment are passed into an interaction function, an environment can only view its own patient_data information.

A future version which gives the interaction function access to all environments would mean that environments can view each other’s data (if visible_to_environment_ids permits it).
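A visibility check of this kind could be sketched as below. This is an assumed helper, not an existing function; it relies only on the visible_to_environment_ids field shown in the example entry.

```python
def visible_entries(patient_data, patient_id, environment_id):
    """Return the stored entries for a patient that a given environment
    is permitted to see, per each entry's visible_to_environment_ids."""
    return [
        entry
        for entry in patient_data.get(patient_id, [])
        if environment_id in entry.get("visible_to_environment_ids", [])
    ]
```

An interaction function with access to all environments could then call this per environment, rather than each environment reading only its own patient_data.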

Note - we are assuming that the content of the data (letter or scan) can be represented in some useful abstract manner for this to be possible. If an actual letter or image is required, generating such an output would be far more complex and probably require a separate data model to produce such data at the end of the simulation. Note the following:

  • A truly realistic letter or image may not be useful for guiding a patient along a pathway during the simulation.
  • The images here are messages passed between nodes that simulate what happened in the real world (when a
    Consultant receives a diagnostic image).
  • Actual images are not being used in the model, so would be an enforced addition with relatively high complexity for low
    precision.
  • Actual images would require (a) an image generation model which needs to be realistic, (b) a component which would
    be able to interpret the image content and make decisions, and (c) a larger amount of memory to store the image. For
    the purpose of the ABM, only certain attributes of the image are required, such as metadata. We recommend this
    metadata-only approach.
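A metadata-only stand-in for a diagnostic image might look like the sketch below. The attribute names are invented for illustration; the point is that a downstream node can make decisions from interpretable fields without any pixel data.

```python
# Hypothetical metadata-only representation of a scan, per the
# recommendation above: store decision-relevant attributes, not the image.
scan_metadata = {
    "modality": "MRI",            # assumed attribute names for illustration
    "body_part": "knee",
    "finding": "meniscal tear",
    "requires_followup": True,
}

# A consuming node (e.g. a Consultant agent) would branch on these fields.
needs_referral = scan_metadata["requires_followup"]
```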

If the ability to model letters were implemented, the ABM could handle more general interactions such as private healthcare referrals, widening the scope to include patient data outside the NHS.
