The Hippocratic Oath for Technology

First, do no harm

Several years ago, I left Yale Medical School and moved to work at a hardware startup in San Francisco. I went from dealing with emergencies surrounding human life to dealing with emergencies surrounding money. In some ways, it was relaxing. The stakes were lower. We were distanced from vulnerable people, from patients I could accidentally infect with germs in the cracks in my fingernail polish, people that I could devastate with a harsh word or clumsy turn of phrase.

But at the same time, I felt uncomfortably far from people. I had been training for a profession that was deeply integrated in the emotions and needs of others. Now, I was in a profession where the dominant philosophy was keeping people at a distance, understanding users through surveys and A/B tests, and replacing in-person conversation with messaging platforms.

Me on a panel about narrative medicine, during school. Photo Credit: John Curtis

When tech entrepreneurs heard that I went to medical school, they were excited to tell me about their new app that would replace doctors, replace physical exams, or replace therapists. I never felt comfortable having these conversations. As an industry, tech was distanced from people, and it was trying to widen that distance with every new app.

Tech can try to escape from human emotional connections, but we build them, whether we mean to or not. We see it in the recent efforts of Facebook to try to better account for misinformation and negative feelings, the exposure of the addictive nature of apps and smartphones, and the news that more screen-time might affect development. Standing at a distance from the people we serve does not make them feel fewer emotions or cause less psychological fallout. It just makes it harder for us to see our own negative impacts.

A machine slanders several social groups and becomes a center for hate speech. An algorithm meant to protect users from toxicity makes it harder for them to be heard. A company takes out a patent to reconstruct our lives based on eavesdropped buzzwords. We let machines read to our children. Our political systems are influenced by bots and foreign agents behind screens.

Tech is powerful and dangerous, but it can also be helpful, enlightening, life-saving. So what do we do?

 

The Hippocratic Oath for Technology

Tech is an industry, and so is healthcare. But physicians, while working within a system that pursues profitability, stand steadfast in choosing to do no harm. Users may not have to trust hospitals as businesses, but they do have to trust the doctors who have a direct impact on them every day.

As technologists and engineers, we must understand the relationship we have with our users. First, we must do no harm. We must guard the well-being of our users above all else, and design for positive rather than negative impact. We must understand what impact means, not simply in a company-wide or technology-wide sense, but in the larger system of people, emotions, and society.

We must re-apply the Hippocratic Oath to our own work (and pay particular attention to that bit on privacy).

I'm taking my own oath now. I'll remember it while designing, coding, making roadmaps, and onboarding users. Richard and I will repeat it to ourselves as we put together the Happy Robot Company, my first real return to healthcare-adjacent fields with everything I’ve learned from technology. It's my hope that this will keep me closer to people, to my responsibilities, and to the impact I have on their lives.

Here it is again, in case you missed it.

 

The Hippocratic Oath for Technology (Based on the 1964 Hippocratic Oath)

I swear to fulfill, to the best of my ability and judgment, this covenant:

  • I will respect the hard-won scientific gains of those technologists in whose steps I walk, and gladly share such knowledge as is mine with those who are to follow.
  • I will apply, for the benefit of the user, all measures that are required, avoiding those twin traps of overtreatment and therapeutic nihilism.
  • I will remember that there is art to technology as well as science, and that warmth, sympathy, and understanding may outweigh the technology-based solution.
  • I will not be ashamed to say "I know not," nor will I fail to call in my colleagues when the skills of another are needed for a user’s recovery.
  • I will respect the privacy of my users, for their problems are not disclosed to me that the world may know. Most especially must I tread with care in matters of life. If it is given me to improve a life, all thanks. But it may also be within my power to worsen a life; this awesome responsibility must be faced with great humbleness and awareness of my own frailty. Above all, I must not play at God.
  • I will remember that I do not treat a chart of growth, but a vulnerable human being, whose experience may affect the person's family and economic stability. My responsibility includes these related problems, if I am to care adequately for the user.
  • I will prevent harm to users whenever I can, for prevention is preferable to cure.
  • I will remember that I remain a member of society, with special obligations to all my fellow human beings, those sound of mind and body as well as the infirm.
  • If I do not violate this oath, may I enjoy life and art, respected while I live and remembered with affection thereafter. May I always act so as to preserve the finest traditions of my calling and may I long experience the joy of bringing technology to those who seek my help.