Predictive Care Blog

Predictive Care Collaborative

My Trainee Experience in Mental Health Research: Robert Xiao

During my two co-op terms from May to December 2022, I worked with Dr. Maslej on a project examining the impact of using machine learning to enhance the assessment of challenging patient behaviours in acute psychiatric care, such as violence or aggression. I helped integrate, clean, and process several of CAMH's Electronic Health Record datasets and built a functioning predictive model to identify potential inpatient violence based on predictors such as age, diagnosis, and DASA scores.
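To give a flavour of this kind of model, here is a minimal, hypothetical sketch: a logistic regression predicting an inpatient violence outcome from the three predictors named above. The data is entirely synthetic and the feature names, diagnosis categories, and outcome rule are illustrative assumptions, not the actual CAMH pipeline or datasets.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

rng = np.random.default_rng(0)
n = 200

# Synthetic stand-in for an EHR extract: age, a coarse diagnosis
# category, and a DASA total score (DASA scores range from 0 to 7)
df = pd.DataFrame({
    "age": rng.integers(18, 80, size=n),
    "diagnosis": rng.choice(["mood", "psychotic", "substance"], size=n),
    "dasa": rng.integers(0, 8, size=n),
})

# Synthetic outcome loosely tied to the DASA score, for illustration only
y = (df["dasa"] + rng.normal(0, 1, size=n) > 4).astype(int)

# Scale numeric predictors, one-hot encode the diagnosis category,
# then fit a logistic regression on the transformed features
model = Pipeline([
    ("prep", ColumnTransformer([
        ("num", StandardScaler(), ["age", "dasa"]),
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["diagnosis"]),
    ])),
    ("clf", LogisticRegression()),
])
model.fit(df, y)

# Estimated probability of the outcome for each synthetic patient
risk = model.predict_proba(df)[:, 1]
```

A real model on clinical data would of course need careful validation, calibration, and bias auditing before it went anywhere near a ward; the sketch only shows the basic preprocess-and-fit structure.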

"My time at CAMH has not only sharpened my technical skills in data analytics and visualization but also allowed me to witness the transformative power of data science in a multidisciplinary setting." I kept saying this to interviewers during the past job-hunting season when describing what I learned from my internship, but it was much more than that. Frankly, with a purely computing technology background, working in a mental health hospital was not something I initially expected, but it turned out to be an experience that I really cherish. I chose computer science as my major partly because of a misconception about the term 'computer science' – and now I have finally fulfilled that dream: to use computers to do some science, to contribute another step toward health care equality, and not just to write code but to make a real impact on society. This would not have been such a meaningful journey without the guidance and support of Dr. Marta Maslej, Dr. Laura Sikstrom, Dr. Juveria Zaheer, Zoe Findlay, and all the other amazing people on the team.

As for my two cents on AI in mental health: my other work at CAMH involves transforming current datasets to conform to a uniform medical information standard. With the blossoming of large language models, I think what we can expect in the future is a globally standardized, transparent, LLM-powered knowledge base, where data and knowledge can be easily shared and queried for research and clinical purposes. Then, other AI tools "can analyze vast amounts of data to identify patterns associated with mental health issues, enabling timely intervention and support for individuals at risk, helping to prevent the escalation of mental health issues and improving overall outcomes. Additionally, AI can assist mental health professionals in providing personalized and data-driven treatment plans for individuals, enhancing the effectiveness of therapeutic interventions", says ChatGPT.
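The harmonization work mentioned above boils down to mapping heterogeneous source records onto one shared schema, the step that standards such as OMOP or HL7 FHIR formalize. A toy illustration, with entirely hypothetical field names and source labels (not CAMH's actual pipeline):

```python
def to_common_schema(record: dict, source: str) -> dict:
    """Map a source-specific EHR record onto a shared set of fields.

    The source names and field names here are made up for illustration.
    """
    if source == "legacy":
        return {
            "patient_id": record["pt_no"],
            "age": int(record["age_yrs"]),
            "diagnosis_code": record["dx"].upper(),
        }
    if source == "modern":
        return {
            "patient_id": record["patient_id"],
            "age": int(record["age"]),
            "diagnosis_code": record["icd10"].upper(),
        }
    raise ValueError(f"unknown source: {source}")


# Two differently shaped records land in the same schema
row_a = to_common_schema({"pt_no": "A17", "age_yrs": "42", "dx": "f20.9"}, "legacy")
row_b = to_common_schema({"patient_id": "B03", "age": 29, "icd10": "F31.1"}, "modern")
# row_a == {"patient_id": "A17", "age": 42, "diagnosis_code": "F20.9"}
```

Once every dataset speaks the same schema, downstream research queries and models no longer need source-specific logic, which is what makes the shared, queryable knowledge base imagined above plausible.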

Predictive Care Collaborative

A Year of Data Integration Insights

Reflecting on this past year, the evolution of our team’s discussions has led to fascinating insights. Our interdisciplinary team has uncovered new perspectives and pathways, ultimately shaping our research approach and deepening our insight into the dynamics of emergency psychiatric care.

Our initial exploration into the theme of insight shed light on its clinical significance. This concept highlighted the interplay between clinician authority and patient autonomy, setting a foundation for our subsequent discussions, and calling to attention the complexities of how care is administered and received.

From there, our discussions naturally progressed to the intertwined dynamics of time and trust in emergency care. We uncovered how trust is deeply rooted in past experiences, influencing both patient and provider expectations and behaviour. This insight into temporal relationships in healthcare brought forward an understanding of the subjective nature of decision-making and risk assessment, revealing underlying biases that can shape interactions.

Our conversations then evolved, focusing on the balance between agency and structural elements in psychiatric care. We recognized that patient pathways are shaped not only by clinical interactions but also by broader social support systems and existing systemic inequities. This broader view allowed us to delve deeper into the physical and emotional nuances of the emergency department, acknowledging the intersection of personal and professional spaces. In doing so, we gained a clearer understanding of thresholds in client-provider dynamics – often subjective boundaries between trust and distrust – and how they are navigated in care as providers balance therapeutic relationships and administrative work.

Transitioning to interfaces, we examined how their design and functionality, both digital and manual, play a role in maintaining these critical boundaries. Interfaces are essential in shaping trust and communication, and their use shapes how decisions are made under pressure. This discussion segued into our focus on the novel approach of human-AI teaming, where we discussed the potential of Artificial Intelligence (AI) to augment decision-making in healthcare, particularly in assessing risks, and whether it can help mitigate biases we have identified through our research.

These themes led us to further discuss the role of police in emergency psychiatric care and their influence on risk assessment. During mental health crises, the presence and actions of police can significantly impact the emotional landscape for both clients and providers, revealing a delicate balance when police intersect with healthcare. This naturally broadened our understanding of strong emotions in the emergency department. One of those emotions, disgust, although less common than others, can subtly yet powerfully influence patient-provider interactions and decision-making, particularly in complex scenarios involving societal norm violations.

Concluding our year, we turned to the field of Computer-Supported Cooperative Work (CSCW), focusing on articulation to highlight the importance of effective coordination and communication within a mental healthcare team. This final theme tied together our previous discussions, emphasizing the need for systems that support empathetic and efficient decision-making while mitigating biases.

This progression through themes has led us to a deeper understanding of the healthcare ecosystem and the potential for responsible use of AI in mental health care. As we continue to explore this avenue, we're eager to understand how innovative approaches can support fairer, more compassionate, and more equitable patient-centered care.
