What is UNITED STATES OF AMERICA?

The name given to the union of all the states under the U.S. Constitution, in which governmental power is vested in the people of the states.

Written and fact checked by The Law Dictionary