I never really thought too much about whether telling my children that there was a Santa Claus who brought them gifts on Christmas Eve was a good idea. I believed in Santa, so where was the harm in it all? I was on a site recently that I frequent, and on one of the forums a parent was talking about how they do not "do Santa" in their home. As I continued to read the other posts, I saw many other parents who didn't "do Santa". They all had their reasons: some were religious, some were simply about trust.
It got me thinking: was there any harm in allowing my children to believe in Santa? Would they be scarred and unwilling to trust the things I tell them?
As far as not talking about Santa because of religious beliefs, that does not have much effect on whether I tell my children that there is a bearded old man in a red suit who brings them gifts the night before Christmas. I was raised Christian, and my parents still told me there was a Santa. I was upset when I found out he wasn't real, though. Ha.
But I guess I sometimes get stuck in my own bubble. I do not really notice the controversies around things I have always viewed as harmless fun and, in a way, tradition. We are still telling our kids about Santa. There is not much talk about him, but my children get photographed with Santa, and they think that he brings them presents on Christmas Eve. We have fun with it, and the kids enjoy it.