No. If there were no alignment issues before, and as long as the new tires are properly balanced, you will not need an alignment.
You could, but it would be more accurate with new tires.
Not necessarily. If the car was in alignment prior to getting the new tires, it will be in alignment after the new tires. It isn't a bad idea to check the alignment periodically, and it is probably better to check it more often than just when you get new tires.
When a shop aligns your car, they adjust the angles of the wheels so they point straight ahead and sit correctly relative to the road and to each other. You can tell when an alignment is necessary because the tires will wear unevenly and your vehicle may drift to one side.
You do not have to align your vehicle to balance the tires, but a balanced tire will still wear unevenly if the alignment is not correct.
Take the vehicle yourself to an alignment shop and let them do it.
Your alignment could be off, meaning your tires are either toed out a bit or toed in a bit. This can cause your tires to wear out prematurely. Your local garage should be able to do an alignment on your vehicle to straighten things up.
I would suggest parking the vehicle; you will need about $500 for the parts and labor.
New tires do not mean you don't need an alignment. It is always a good idea to have the vehicle aligned after new tires are purchased to avoid premature wear, even if the car shows no sign of misalignment.
Tires do not get out of alignment; they may be out of balance, but not out of alignment. Your suspension, however, can be out of alignment. The only way to know for sure is to have it checked.
If the vehicle is front wheel drive, yes.
Your tires are worn. Get new tires and an alignment. Or, if it's a truck, learn to drive with it... seriously.
That all depends on the alignment of your vehicle's tires. Some cars pull to the left as well.