English Wikipedia - The Free Encyclopedia
Liberalism in the United States
Liberalism in the United States is a broad political philosophy centered on the unalienable rights of the individual. The fundamental liberal ideals of freedom of speech, freedom of the press, freedom of religion for all belief systems, the separation of church and state, the right to due process, and equality under the law are widely accepted as a common foundation across the spectrum of liberal thought.

See more at Wikipedia.org...
© This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and under the Creative Commons Attribution-ShareAlike License