Polynomials turn up all over the place. There are multiple good reasons for this. For one, suppose we have any continuous function that we want to study. (“Continuous” has a technical definition, although if you imagine what we might mean by that in ordinary English — that we could draw it without having to lift pen from paper — you’ve got it, apart from freak cases designed to confuse students taking real analysis: continuous functions that don’t look anything like something you could ever draw, which is jolly good fun until the grades are returned.) If we’re willing to accept a certain margin of error around that function, though, then on any closed interval we can always find a polynomial that stays within that margin of the function we really want to study. I have read, albeit in secondary sources, that for a while in the 18th century it was thought that a mathematician could just as well define a function as “something that a polynomial can approximate”.
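That margin-of-error claim is, in modern terms, the Weierstrass approximation theorem. A quick numerical sketch of it, using NumPy and taking the sine function as a stand-in for “any continuous function” (the interval, the margin, and the degree cap here are all arbitrary choices for illustration), might look like this:

```python
import numpy as np

# A continuous function to study -- sine as a stand-in. Any continuous
# function on a closed interval would do, per Weierstrass.
f = np.sin

# Sample the function on the closed interval [0, pi].
xs = np.linspace(0.0, np.pi, 200)
ys = f(xs)

# Accept a margin of error, then raise the polynomial's degree until a
# least-squares fit stays inside that margin at every sample point.
margin = 1e-4
fit_degree = None
for degree in range(1, 20):
    coeffs = np.polynomial.polynomial.polyfit(xs, ys, degree)
    worst = np.max(np.abs(np.polynomial.polynomial.polyval(xs, coeffs) - ys))
    if worst < margin:
        fit_degree = degree
        break

print(f"degree {fit_degree} polynomial fits within {margin}")
```

Tighten the margin and the loop just needs a few more degrees; that is the theorem’s promise that the margin can be made as small as we like.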