Yes, it's the Right Wingers who try to redefine Patriotism and American Values. The United States is a Liberal country, founded by Liberal men based on Liberal principles.
Yes, such as limited government with the states holding most of the power, no direct taxes, a free-market economy, unlimited freedom of the press and religion, the principle that every citizen is a member of the militia and as such has the right to own a gun, and no need for a standing army (which is currently sitting in over 35 foreign nations, I might add).
Liberals have changed this country just as much as conservatives have, if not more so.