Siri may be a machine, but that doesn’t mean she has to be cold-hearted.

For years, people have turned to the automated assistant for all of life’s little questions. But big-picture stuff — like solving existential troubles and offering life advice — has never been Siri’s forte.

Now, it seems Apple is eager to change that. According to a recent job posting, plenty of people turn to Siri for guidance, comfort, and support. So the company is seeking out engineers with a background in psychology and peer counseling to better respond to those emotional needs.

“People have serious conversations with Siri,” states the Apple ad for a Siri software engineer. “People talk to Siri about all kinds of things, including when they’re having a stressful day or have something serious on their mind. They turn to Siri in emergencies or when they want guidance on living a healthier life.”

Yet none of today’s artificial intelligence programs are skilled at handling serious life crises. In fact, Stanford researchers found that four widely used conversational agents — Siri (Apple), Google Now (Google), S Voice (Samsung), and Cortana (Microsoft) — responded inconsistently and incompletely to critical questions about mental health, rape, and domestic violence.

Voice-recognition programs have the potential to offer a solution to these major public health issues, the researchers assert. “As ‘first responders,’ these agents could help by referring people to the right resources during times of need,” says Dr. Eleni Linos, the senior author of the research.

Thankfully, improvements are being made. As the company’s job posting puts it, Apple is looking for candidates to “play a part in the next revolution in human-computer interaction.” If the company can find engineers who double as psychologists, we may soon find ourselves venting to compassionate computers and automated therapists.