THE AMERICAN INVASION OF THE WEST INDIES

(By a Colonist.)

The British Colonies in the West Indies, after a long period of depression, are again beginning to show signs of returning prosperity. This is evident, not only in connection with the old settled ...