Human perceptual sensitivity often improves with training, a phenomenon known as "perceptual learning". Another important perceptual dimension is appearance, the subjective sense of stimulus magnitude. Are training-induced improvements in sensitivity accompanied by more accurate appearance? Here, we examined this question by measuring both discrimination (sensitivity) and estimation (appearance) responses to near-horizontal motion directions, which are known to be perceptually repulsed away from horizontal. Participants performed discrimination and estimation tasks before and after training in either the discrimination task, the estimation task, or neither (control group). Observers who trained in either task improved in discrimination accuracy, but estimation repulsion did not decrease; it either persisted or increased. Hence, perceptual learning may even exacerbate distortions in perception. To explain this counterintuitive finding, we developed a computational observer model in which perceptual learning arises from increased precision of the underlying neural representations. For each observer, the fitted model accounted for discrimination performance, the distribution of estimates, and their changes with training. Our empirical findings and modeling suggest that learning enhances distinctions between categories, a potentially important aspect of real-world perception and perceptual learning.
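A minimal sketch may make the two measures and the role of precision concrete. It assumes a Gaussian-noise observer whose measurement precision increases with training, a flat prior over direction, and a decision-conditioned (self-consistent) estimation rule in which the estimate is the mean of the posterior truncated to the chosen category; the 3° stimulus direction and the pre/post noise levels are illustrative assumptions, not values from the study, and this toy is not the per-observer fitted model described above.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def toy_observer(theta, sigma, n=200_000):
    """Simulate discrimination and estimation for a direction theta
    (deg from horizontal) encoded with Gaussian sensory noise sigma.

    Discrimination: report the sign of the noisy measurement m
    (clockwise vs. counterclockwise of horizontal).
    Estimation: report the mean of the posterior over theta truncated
    to the chosen category (flat prior) -- a decision-conditioned rule
    that yields repulsion away from the horizontal boundary.
    """
    m = theta + sigma * rng.standard_normal(n)
    accuracy = np.mean(np.sign(m) == np.sign(theta))
    # Mean of N(m, sigma^2) truncated at 0 on the side of the decision:
    z = np.abs(m) / sigma
    estimate = m + np.sign(m) * sigma * norm.pdf(z) / norm.cdf(z)
    return accuracy, estimate.mean() - theta  # accuracy, signed repulsion bias

# Illustrative noise levels: sigma = 6 deg pre-training, 3 deg post-training.
for sigma in (6.0, 3.0):
    acc, bias = toy_observer(theta=3.0, sigma=sigma)
    print(f"sigma = {sigma:.0f} deg: accuracy = {acc:.2f}, bias = {bias:+.2f} deg")
```

In this flat-prior toy, higher precision improves discrimination accuracy while the repulsive estimation bias persists (though it shrinks somewhat); capturing the observed persistence or growth of repulsion presumably requires the additional structure of the fitted observer model, such as direction-dependent precision or non-flat priors.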