But as a BBC reporter working in California, he has found that many Americans see him differently.
Although, as JJ Merelo notes, a complete answer is hard to give, my guess is that Spain is, in fact, proud of Hispanic America (that is, if you count the whole continent).
Hispanic America became the main part of the vast Spanish Empire. Napoleon's takeover of Spain in 1808 and the ensuing chaos initiated the dismemberment of the Spanish Empire, as the Hispanic American territories began their struggle for emancipation.