Disclosing Personal Information? It May Be Less Embarrassing To Tell It To a Computer Than a Doctor

2014-07-15

Tags: Health, Psychology, communication, data, self-report, user experience
That's personal!

Most of the data I work with is self-report, provided by a user to a database via a device like a computer or a mobile phone. No live counselor or coach processes that information before it's crunched in the database and appropriate content is selected for the user to read. This method certainly has drawbacks. We don't have the luxury of interpreting non-verbal cues like facial expression or tone of voice that could add nuance to a user's words. We can't be as sensitive with follow-up questions as we would be in a live conversation, since any follow-ups and their associated skip logic are pre-written. And we don't give users an opportunity to add color commentary, which leads to occasional frustrated feedback from users who really want to explain the specific circumstances of their health.

One drawback this type of data suffers from less than people might expect, though, is a lack of veracity. Surprisingly, when talking to the computer, users don't lie. In my work, we've often found that when we compare self-reported health data to verified sources like medical claims or health records, people are generally quite honest about what they tell the computer.

Now a new report confirms not only that people are fairly honest in self-report, but that they may be even more comfortable telling embarrassing health information to a computer than to a person. As a psychologist, I think this makes total sense. There are many factors that would make it more difficult to tell a human being embarrassing health information than a computer, such as:

The last point, about privacy, also ties into why it is so important that the infrastructure around your web intervention be well-organized and well-communicated. Terms of service and privacy agreements are critical.
These documents are notoriously long and impractical for the average user to sift through, and it may not be possible to radically remodel them in the short term while maintaining appropriate legal protection. However, especially after conducting many rounds of user testing, I think it's important to create a condensed, user-friendly "cheat sheet" for the terms of service, and especially for the data privacy policy, of any online intervention. The questions it should answer include:

In the USC research, participants' willingness to disclose information was directly related to their belief that their information would be kept private and protected. If we want users to truly realize the value that online coaching interventions can bring, it's critically important that we think about the often taken-for-granted infrastructure around these interventions and revise it to give users a better level of education, understanding, and comfort.