I always hear people saying things like "Immigrants are taking all our jobs," "Immigrants are ruining the United States," or "Immigrants are changing what America is all about." But in reality, America wouldn't be what it is without us. We gave the US a little "spice" with the variety of cultures we brought here. The same people who say those things are out eating Mexican food and getting some type of work done by an immigrant.

They are so quick to turn their backs on people who come here in search of a better future without harming anyone, while many who were lucky enough to be born here just didn't take advantage of it. People easily think they are superior to others just because a piece of paper tells them they belong in the US. But aren't we all immigrants? Aren't we all living on land that was once foreign to our ancestors? When it comes to looking at it that way, people prefer to be blind.

Just like in the article I stumbled upon, where they want to blame immigrants for making US citizens jobless and poor, instead of actually looking at what those citizens do wrong to end up in the situation they are in. Maybe they are too lazy to actually work and prefer to ask for money at a red light. Or maybe they just don't want to work for the pay the employer offers. That's when the immigrant comes in: they do want to work and accomplish success, so they are willing to take any job and accept any pay, as long as they can provide for their families.

US citizens are able to receive endless benefits from the government, which also gives them an excuse not to go to work, because they have other ways of receiving help. Unlike an illegal immigrant, who has to work and do everything they can to make some money just to be able to afford food for their family. Of course, there are exceptions to everything I'm saying, but we have to open our eyes and look beyond ILLEGAL OR LEGAL.
That is not the issue…