I remember that when I was growing up, I never really understood the purpose of car insurance. I knew it was one of those things my parents said was necessary, but I didn’t know why. They once explained it to me like this: they bought insurance so they would be protected if something happened, and the companies hoped you’d never need them but were there if you did. It’s a pretty simplistic explanation, but as a child that was all I needed to comprehend.
Something I’ve always found interesting is that insurance laws aren’t the same across the United States. Some states don’t actually require you to carry insurance, while in others it’s mandatory. I’ve also learned from living in different places that the prices aren’t always the same either. One of my best friends found cheap auto insurance in Texas, while another friend found similar coverage at a good rate, but still more expensive than in Texas, simply because she lives in New York City, where rates run higher than in a rural Texas area. I’ve always found the whole system a little complicated, but because I grew up hearing about the importance of things like car insurance, home insurance, and even life insurance, I can’t imagine NOT having it. I know that to some people it isn’t a necessity, but I think it should be.
What about you? How were you raised and how do you feel about insurance?