Term frequency (TF) is a measure used in information retrieval (IR) of how frequently a term appears in a document. The underlying assumption is that the more often a term appears in a document, the more relevant that document is likely to be for that term. TF is calculated as the ratio of the number of times a term appears in a document to the total number of terms in that document.
For example, in a document with 100 words, if the term "machine" appears 5 times, the term frequency for "machine" would be 5/100 = 0.05. This gives an indication of how prominent a term is within a document.
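As a rough illustration, a minimal Python sketch of this calculation might look like the following. It assumes the document has already been tokenized into a plain list of lowercased words; a real system would also handle tokenization, normalization, and stemming.

```python
from collections import Counter


def term_frequency(term: str, document: list[str]) -> float:
    """Ratio of the term's occurrences to the total number of terms in the document."""
    if not document:
        return 0.0
    counts = Counter(document)
    return counts[term] / len(document)


# A toy 100-word document in which "machine" appears 5 times reproduces the example above.
doc = ["machine"] * 5 + ["learning"] * 95
print(term_frequency("machine", doc))  # 0.05
```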
TF is an important component in ranking documents during retrieval. On its own, however, TF may not be sufficient, because it does not account for how common the term is across the entire document collection: a word like "the" occurs frequently in almost every document without signaling relevance. To address this, TF is often combined with inverse document frequency (IDF) to create the more robust TF-IDF metric.
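A minimal sketch of how the two measures might be combined is shown below. It uses the common log(N / df) form of IDF, where N is the number of documents and df is the number of documents containing the term; practical implementations typically add smoothing (for example, adding 1 to the document frequency) and length normalization.

```python
import math
from collections import Counter


def tf(term: str, document: list[str]) -> float:
    """Term frequency: occurrences of the term divided by the document length."""
    if not document:
        return 0.0
    return Counter(document)[term] / len(document)


def idf(term: str, corpus: list[list[str]]) -> float:
    """Inverse document frequency in the common log(N / df) form."""
    df = sum(1 for doc in corpus if term in doc)
    if df == 0:
        return 0.0
    return math.log(len(corpus) / df)


def tf_idf(term: str, document: list[str], corpus: list[list[str]]) -> float:
    """TF-IDF weight of a term in one document, relative to the whole corpus."""
    return tf(term, document) * idf(term, corpus)


corpus = [
    ["machine", "learning", "is", "fun"],
    ["deep", "learning", "models"],
    ["statistics", "and", "probability"],
]

# "machine" appears in only one document, so it gets a higher weight there
# than "learning", which appears in two documents.
print(tf_idf("machine", corpus[0], corpus))   # ~0.27
print(tf_idf("learning", corpus[0], corpus))  # ~0.10
```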