(This explanation assumes you are confidently capable of long division. We can add an appendix about that if one is needed.)
If you want to divide one by three, you have a problem: either you can't because 1 < 3, or you can't because the long division never ends - you just keep getting more 3s. So, we make a convention: when we have a repeating part that never ends, we just indicate what repeats and let that stand for what we would get if we could write infinitely many decimal places. And the nice thing is that this works - you can do addition, subtraction, multiplication, and division the same way you did before; you just have to figure out what the result and its new recurring decimal will look like.
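For example (just a quick worked sketch, using nothing beyond the long division above), adding a third to a third behaves exactly the way it should:

\[
0.333\ldots + 0.333\ldots = 0.666\ldots = \tfrac{2}{3}
\]

Every decimal place is just 3 + 3 = 6, there is never a carry, and 0.666... is exactly what long division gives you for two divided by three.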
But a weird thing happens sometimes. If you multiply one-third by three, you get one. But if you multiply 0.333... by 3, all those threes become nines and you have 0.999.... So either our nice new strategy just broke ... or we have to declare that 0.999... equals 1. But in math, you can't just declare it; you have to show that it works to do it that way.
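Written out place by place (again, just a sketch of the arithmetic, not a new rule), the two multiplications are

\[
3 \times \tfrac{1}{3} = 1
\qquad\text{but}\qquad
3 \times 0.333\ldots = 0.999\ldots
\]

because in every decimal place you compute 3 × 3 = 9, and since 9 is less than 10 there is never a carry to push any digit higher.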
So, this is important. All of elementary school mathematics is riding on this. Can we prove 0.999... equals 1? We know it should - a third times three is one - but can we prove it?
Here are two arguments, and I think you can make both hold up in court.
( First: can 0.999... be anything else? )

( Second proof: let's do a little algebra. )

( Closing thoughts )