Auto insurance in California is mandatory. Almost everywhere in the US, if you own and drive a vehicle, the law requires you to carry auto insurance, and the State of California enforces this requirement as well. California is a captivating place to live, but its auto insurance rates are among the most expensive in the US.