American Liberalism

Date: Sep 14, 2017
Category: History Essay

The word "liberal" derives from the Latin liber, meaning free. Liberalism can be defined as a free way of thinking and acting in both private and public life; it also refers to a political inclination that favors social progress through reform rather than revolution. Liberalism in US history reached its peak in the 1960s, a period in which many of the nation's great social problems were being tackled. The war on poverty, the civil rights movement, the rise of the New Left, and Black Power all took place during this important era (Fischer 26). John F. Kennedy's ascent to the presidency at the beginning of the 1960s marked the start of a vast and sweeping liberalism that swept across the US and promoted positive change (Matusow 209).

However, liberalism's subsequent internal divisions contributed to the unraveling of American society. John F. Kennedy marshaled the youth into action that could embody the change America needed. The war on poverty was waged during this period, but though it seemed progressive it bore little fruit, because many of its programs simply degenerated into bureaucracy (Rodney 28). The civil rights movement began with nonviolent protest but gave way to violent ghetto riots with the rise of Black Power. The Vietnam War cast liberals as supporters of the Cold War and united the opposition against them, since many Black Americans saw their plight as similar to that of the Vietnamese, and the war badly damaged Johnson's standing (Matusow 212).

Organizations like Students for a Democratic Society lobbied for students' rights and against what they saw as totalitarianism in America; many of their members later chose to drop out of society rather than change it. By 1968 the country was tired of change, and the Democrats lost over 12 million votes because many people felt alienated. This set the stage for the Republican revival of the seventies and eighties (Fischer 27). American society today is a product of 1960s liberalism. One vivid legacy is women's liberation, championed in the early 1960s, which elevated women to a much higher status in society. The civil rights movement likewise changed America through legislation and education (Rodney 31).