What Is Auto Insurance?

Auto insurance is a contract between you and your insurer that protects you against financial loss in a number of scenarios. In exchange for your premium payments, the insurance company agrees to provide property, liability, and medical coverage. Personal auto insurance is mandated in almost all U.S. states, and many coverage options are available to suit your needs and financial situation.