With more illegal immigrants entering the U.S., many Americans are beginning to think that immigrants, both legal and illegal, are taking over their opportunities. Americans are starting to feel as if immigrants are claiming the jobs available to them and the benefits that the government offers.
From what we have seen in the past, many maintenance jobs, such as cleaning in restaurants, hotels, and other places, have gone to immigrants because Americans do not want them. Some employers do not care whether the person cleaning their restaurant's kitchen or toilets is a legal or illegal employee. Many employers would rather hire illegal immigrants for cheap labor, and since these workers are paid "under the table," it also means less paperwork for them. So why do Americans now want to take those opportunities away from these immigrants?