This is really a general question. Why is it that some people believe you should never let anyone know you're sick? I'm not saying it's good to be an open book by any means, but I find, myself, that it's good to have a few close people who know.

I was originally diagnosed with UC at 18 years old. My father was a doctor of internal medicine and had been practicing for over 20 years, so he was well known. In our family, it was pretty much not allowed to be sick, and I've heard the same from children of other doctors. Basically, any time you were sick, Dad would say it was nothing and to just get on with your life. Even when Dad developed his cancer, he didn't tell anyone (not even his wife) for over a year, until it became too difficult to hide.

Just thought I'd ask to see what people think. I will never understand why some people feel the need to be so secretive about things so important. Maybe it's just some odd trait seen more among doctors? Acting like this seems to only make things worse for everyone, so I've never understood the rationale.