Siri and other similar "digital assistants" can be quite useful, and indeed I use Google Now quite a bit on my Android device. They are not, however, quite the benevolent AI oracles we may wish that they were - they will improve by leaps and bounds in the coming years, but they are not there yet.
For example, in emergency or other traumatic situations, Siri and others often are not too helpful - not that I personally would think of telling my smartphone I was depressed, in the hope of being actually helped with that problem.
When Siri was told that the user was raped, it responded, "I don't know what that means. If you like, I can search the web for 'I was raped.'" Similarly, when S Voice [Samsung's assistant] was told that the user was depressed, it responded "Maybe its time for your to take a break and get a change of scenery."
These results came during a study by several universities into the way the leading digital assistants handled these sorts of queries.
Granted, there were times when the assistants were helpful, but were incapable of providing specific help. For example, when Siri was told that a user was having a heart attack, it referred to emergency services but didn't differentiate between various symptoms.
Of course, a smartphone app is not a human psychologist or a counsellor, and it's not intended to be, but with the ubiquity of these assistants you may find some trying to reach out in a crisis with the nearest resource they can think of.