For those who have difficulty with what "on average" means, here is an operational definition.
Roll a single die until you have seen all six numbers, and write down the number R of rolls it took.
Repeat the process a large number of times and take the average of R.
That gives you a good estimate of "on average".
And you can see from this description that the average [not necessarily an integer] exists and is finite.
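The procedure above can be sketched as a short simulation (a Python sketch, purely illustrative; the trial count and seed are arbitrary choices of mine):

```python
import random

def rolls_to_see_all_six(rng):
    """Roll a fair die until all six faces have appeared; return the roll count R."""
    seen = set()
    rolls = 0
    while len(seen) < 6:
        seen.add(rng.randint(1, 6))
        rolls += 1
    return rolls

def estimate_average(trials=20000, seed=1):
    """Repeat the experiment many times and average R."""
    rng = random.Random(seed)
    return sum(rolls_to_see_all_six(rng) for _ in range(trials)) / trials

print(estimate_average())  # typically prints a value close to 14.7
```

With more trials the estimate settles down, which is the practical sense in which the average exists and is finite.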
The average can be computed based on probabilities for a fair die, as Chuck Rampart did.
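I'm assuming the computation referred to is the standard coupon-collector argument: once k distinct faces have been seen, each roll shows a new face with probability (6-k)/6, so the expected wait for the next new face is 6/(6-k), and those waits add. A sketch of that arithmetic:

```python
from fractions import Fraction

# After k distinct faces have appeared, a roll shows a new face with
# probability (6 - k)/6, so the expected wait for it is 6/(6 - k).
# Summing the waits for k = 0, 1, ..., 5 gives the expected total rolls.
expected_rolls = sum(Fraction(6, 6 - k) for k in range(6))

print(expected_rolls)         # 147/10
print(float(expected_rolls))  # 14.7
```

Note that 14.7 is not an integer, consistent with the remark below about the answer falling between two whole numbers of rolls.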
Some responses answered this question "How many times must one roll a single fair die to be certain all six numbers show?"
That's a different question.
No finite number of rolls will ensure that.
The die could come up "1" on every roll, for example.
The difference between certainty and the expectation of a chance outcome is basically this.
If you bet money on outcomes that have a favorable expectation, then over time you will win, and vice versa.
In this case, take CR's answer [it's not an integer].
If you bet that you will show all six numbers within the next higher integer number of rolls, you will win money over time.
If you bet that you will show all six numbers within the next lower integer number of rolls, you will lose money over time.
Hope that helps explain logical expectation in practical terms.