Should You See a Doctor After a Car Accident Even If You Don’t “Feel” Injured?
After a car accident, you should always see a doctor as soon as possible, even if you do not believe you are injured. Symptoms of some serious accident-related injuries may not appear or be noticed until days or even weeks after the accident. Seeking professional medical care as soon as possible can …