Crowdsourcing refers to distributing microtasks to an unknown group of online workers. Because workers vary in expertise, a major research challenge in crowdsourcing is addressing the twin problems of untargeted task assignment and expertise-unaware aggregation of results. Although existing approaches can estimate worker expertise and use this information to allocate tasks, their effectiveness is limited for the following reasons: 1) reliance on human intervention; 2) dependence on the type of answers; 3) the assumption of non-sparse answer data; 4) estimation of expertise only after answers have been collected. To overcome these limitations, this paper introduces an unsupervised, answer-type-independent approach to expertise estimation in microtask crowdsourcing, named ROUgh set based eXpertise estimation (ROUX). We cast expertise estimation as a metaheuristic optimization search problem and integrate it with rough set theory to better estimate the expertise of each online worker. ROUX then uses the resulting expertise ratings for task assignment to maximize the accuracy of the results while lowering cost. Extensive experimental evaluations on real-world datasets show that ROUX achieves substantial gains in both accuracy and cost efficiency.
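The abstract only sketches the method at a high level, so the snippet below is a minimal, hypothetical illustration of treating expertise estimation as a metaheuristic search, not ROUX itself. A simple hill-climbing search perturbs a vector of worker expertise scores and keeps a change only when it increases agreement between each worker's answers and an expertise-weighted consensus. The function names (weighted_consensus, fitness, estimate_expertise), the objective, and the toy data are all assumptions made for illustration; the rough-set component of ROUX and its task-assignment policy are not reproduced here.

```python
import random
from collections import defaultdict

def weighted_consensus(answers, expertise):
    """Pick, for each task, the answer with the largest expertise-weighted vote
    (a hypothetical aggregation rule, not necessarily the one used by ROUX)."""
    consensus = {}
    for task, worker_answers in answers.items():
        votes = defaultdict(float)
        for worker, ans in worker_answers.items():
            votes[ans] += expertise[worker]
        consensus[task] = max(votes, key=votes.get)
    return consensus

def fitness(answers, expertise):
    """Reward expertise assignments under which high-expertise workers agree
    with the weighted consensus and low-expertise workers do not."""
    consensus = weighted_consensus(answers, expertise)
    score = 0.0
    for task, worker_answers in answers.items():
        for worker, ans in worker_answers.items():
            score += expertise[worker] if ans == consensus[task] else -expertise[worker]
    return score

def estimate_expertise(answers, workers, iters=2000, seed=0):
    """Hill-climbing search over expertise vectors: perturb one worker's score
    at a time and keep the change only if the objective improves."""
    rng = random.Random(seed)
    best = {w: 0.5 for w in workers}  # start from uniform expertise
    best_fit = fitness(answers, best)
    for _ in range(iters):
        cand = dict(best)
        w = rng.choice(workers)
        cand[w] = min(1.0, max(0.0, cand[w] + rng.uniform(-0.2, 0.2)))
        cand_fit = fitness(answers, cand)
        if cand_fit > best_fit:
            best, best_fit = cand, cand_fit
    return best

# Toy run: w1 and w2 agree with each other on every task while w3 does not,
# so w3's estimated expertise should end up lower than the other two.
answers = {
    "t1": {"w1": "cat", "w2": "cat", "w3": "dog"},
    "t2": {"w1": "car", "w2": "car", "w3": "bus"},
}
print(estimate_expertise(answers, ["w1", "w2", "w3"]))
```

In the full method described by the abstract, the estimated expertise ratings would additionally drive task assignment; that step is omitted from this sketch.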