Employee rights in the United States include the right to be paid legally required and agreed-upon wages, to work in physically safe conditions, and to be free from harassment.