Op Eds

Calling CAMHS to Ask Courageous Questions

By Sitraka St. Michael, Contributing Writer

The Patient Satisfaction Survey for Counseling and Mental Health Services (CAMHS) users returned this past semester after a hiatus at the end of the 2014-15 academic year. Its return offered a hopeful sign that a recent op-ed urging CAMHS to solicit feedback from students immediately following each visit may have reached its administrators. And yet, despite this sign of progress, the emailed survey has yet to ask CAMHS end users—students—for feedback promptly and courageously.

Embedding a procedure to solicit feedback from students immediately following each CAMHS visit remains the most insightful and courageous way for University Health Services (UHS) to assess the effectiveness of its mental health care services. The assumption that underpins the case for a prompt, onsite, post-visit questionnaire for CAMHS users is simple: students are serious thinkers. If CAMHS dares to see us as such and ask us how—not merely whether—it provided care and how it can see us more fully, it can tap into a trove of insightful feedback.

Insightful feedback requires courageous questions. The patient satisfaction survey did little, if anything, to ask students whether, and if so how, CAMHS providers failed to see us in the fullness of our life experiences when we came in for a counseling session. And if the timing and the substance of the survey are any indication, one cannot help but wonder how willing CAMHS is to ask students how it can provide care in more relevant and empowering ways.

CAMHS emailed its survey along with the deluge of evaluation requests—including Q scores for classroom experiences—that flood students’ inboxes amidst the crush of Reading Period. Though understandable, this administrative decision does not seem sound or courageous. Classrooms and counselors’ offices are decidedly distinct spaces. And the dimensions of students’ experiences that each touches on are decidedly distinct matters.

The survey’s timing is not the only problem. Its use of language is equally problematic. The survey’s questions are phrased overwhelmingly in yes-or-no form, and closed-ended questions tend to foreclose opportunities to hear more stories and collect more data. The survey is not immune to these cognitive and conversational consequences of yes-or-no questions. Indeed, many of its questions fail to incorporate follow-up questions that might have enabled courageous inquiry and provided insightful feedback for CAMHS.

One way to sustain courageous inquiry in the survey is to design questions according to the principle of segmentation. Our increasingly diverse student body confronts CAMHS with an increasing multiplicity of stories, needs, and possibilities. If CAMHS is to remain responsive to the multitudes we contain, it needs to ask follow-up questions that assess the effectiveness of its services according to the multiple segments in the student population.

Under the framework of segmentation, each survey question would come with follow-up and—this is key—segmented questions that elicit further, case-specific reflections and demographic data points from students. Here is a concrete application: A student who answers “No” when asked whether they felt they had had a sufficient number of sessions to address their primary concern would be asked two follow-up, segmented questions, one offering a range of additional sessions and the other asking for their demographic information. Without questions that segment results by students’ years, schools, and demographics, CAMHS might find, for instance, that 70 percent of students said the current baseline number of sessions sufficed to address their primary concern and yet fail to see disparities in satisfaction rates among different segments of students.

Demographic factors such as gender, nativity, ethnicity, and socioeconomic insecurity do affect the likelihood of student engagement with mental health care services. Pretending that they don’t by denying them language and a place in the patient satisfaction survey will not make their embodied effects go away. The value of follow-up, segmented questions resides in the capacity they can give CAMHS to map and ultimately monitor the impact of demographic factors on the relevance of its care services. The resulting map may bring up troubling and difficult results, but detecting difficulties is key to making progress. In fact, difficult discoveries based on students’ stories can strengthen CAMHS’s capacity to see how some of its current practices may keep its care providers from seeing students where they are.

It is possible both to appreciate the many things University Health Services is already doing well and to continue to highlight how it can do these things, and more, better. The return of the CAMHS patient satisfaction survey this year was a good thing. And yet its timing and language could still have been better and more courageous. The survey’s use of language did not enable students to reflect courageously upon their experiences at CAMHS. Nor did it ask students questions that might invite CAMHS providers to confront the ongoing ways in which the tone and touch of its procedures may fail to affirm the dignity of every student and leave them feeling invisible.

Improving the survey requires having the courage to see the multitudes we embody and to equip us to embrace the multiple forms of growth it takes to get through this thing called life in the 21st century. Courageous questions ask how—and whose lives—CAMHS providers are not yet equipped to see and comprehend in the fullness of their complexities and possibilities. Many students and young alumni encounter the lifelong importance of seeing and being seen during their experiences with CAMHS. Many more may if CAMHS musters the courage to ask students, promptly and systematically, “How didn’t we see you?” with the help of follow-up, segmented questions. This year’s patient satisfaction survey does not contain any such questions. Future surveys do not have to be like this year’s.


Sitraka St. Michael, M.Div. ’17, is a third-year student at Harvard Divinity School.
