Winter Tires -- Are They Necessary?
If you see a lot of winter weather where you live and drive, you may wonder whether you need to put winter tires on your car or whether you can get by with a good set of all-season tires. The folks over at Consumer Reports have looked into the same question, and the answer is...it depends.
Even if your vehicle has all-wheel drive, winter tires deliver better stopping grip on snow and ice than most all-season tires. They are the right choice if you regularly drive in wintry weather, or if you just want greater peace of mind.
On the other hand, if you live in an area with infrequent winter weather or can wait for the roads to be cleared after a storm before heading out, all-season tires are a better choice.